Category Archives: Cloud Computing

High Performance Scientific Computing on the Cloud?

What is this blog about?

Scientific computing has long been dominated by elite scientists and engineers at top universities and large engineering and aerospace companies, leaving little scope for students and enthusiasts to understand and learn it.

With the advent of cloud leaders such as Amazon, Rackspace, and Penguin Computing, has this changed? Let us explore in this blog, which targets geeks like me who want to set up an open-source solver on an HPC cloud.

Background on what I have been doing:

For the past year, I have been working with one of the world's leading aeronautical companies, automating their aircraft analysis and design workflows. There were a lot of security restrictions on accessing any of their solvers or understanding how their high-performance computing servers work. This is close to my background in mechanical engineering, and since I had done a project in CFD and finite element analysis using Ansys a while ago, I wanted to understand what was going on. Being a geek, I was curious about the nuts and bolts of high-performance scientific computing. Below is what I have understood so far.

Nuts and Bolts of High Performance Scientific Computing


There are a lot of open-source solvers; there is a good discussion in "Why isn't open source CFD solution for everyone?". One of the leading ones is the Stanford University Unstructured (SU2) solver. SU2 is the backbone of some cloud-based tools like SimScale, and interestingly the source code is on GitHub, so let us dive into it. It hardly takes a couple of hours to set up this solver on Ubuntu Linux, run a simple airfoil mesh generation, and view the airfoil in ParaView, an open-source visualization tool.

For a quick start, you need g++ and the make utility. Git clone the SU2_EDU source code and build and run SU2_EDU first; the instructions are provided in the link itself. Before running, go to the bin directory and make a few tweaks to the configuration file ConfigFile_RANS.cfg as below.


# run the command: ./SU2_EDU
# select option 1
# select the file airfoil_rae2822_lednicer.dat
# it will run for 2000 iterations and create the flow.vtk and solver_flow.vtk files

Once your run is complete, open the ParaView GUI and load the flow.vtk and solver_flow.vtk files; you can see the airfoil mesh and volume grids.

Cloud-based HPC:

A quick Google search shows two leaders in HPC on the cloud: Rackspace and Penguin Computing. Penguin has a free plan that lets you run a 5-minute HPC job; you just need to register and run the job to experience how to run a solver on HPC.

Integrating Solver with HPC:

If you want to run the solver on a cloud-based HPC server, the key is to rebuild SU2 with OpenMPI support, as below:

./configure --with-MPI=mpicxx --with-Metis-lib=/usr/local/metis-4.0.3 --with-Metis-include=/usr/local/metis-4.0.3/Lib

Once you build it, you can upload the binaries to the cloud and run the solver on as many CPUs as you like; the response time will be considerably better. One way to check the details of the job is:

qstat -w <the PBS id returned>
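On a PBS-style cluster (which is what the qstat command above implies), the run is usually wrapped in a batch script submitted with qsub. Below is a minimal sketch; the job name, node counts, walltime, and file names are assumptions to adapt for your cluster.

```shell
# Write a sketch of a PBS batch script for the MPI-enabled SU2 build.
cat > run_su2.pbs <<'EOF'
#!/bin/bash
#PBS -N su2_airfoil
#PBS -l nodes=2:ppn=8
#PBS -l walltime=01:00:00
cd "$PBS_O_WORKDIR"
# run the solver across all 16 allocated cores
mpirun -np 16 ./SU2_CFD ConfigFile_RANS.cfg
EOF
# Submit with:   qsub run_su2.pbs    (prints the PBS job id)
# Then inspect:  qstat -w <the PBS id returned>
```

The `#PBS` lines are directives read by the scheduler, not comments for you; the mpirun core count should match nodes x ppn.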

Final Verdict

It is absolutely possible to set up a decent open-source solver on an HPC cloud and run a few solutions. I hope this blog was helpful.

Physics Engine and HTML5 Canvas

Intent of this blog space: to build a testbed for trying out various Physics Engine concepts, modelling computer-graphics elements, and visually testing them on any device using a Physics Engine and the HTML5 Canvas. For the people in a hurry:

  • Click here to see the progress of the testbed
  • Get the latest code from my GitHub and follow the instructions to set up the application and run it locally.

In this blog I discuss what a Physics Engine is, the various frameworks that support it, and the technologies that connect a Physics Engine to the HTML5 Canvas.

What is a Physics Engine? A Physics Engine is a computer simulation tool used extensively in game development, education, and various other applications. A quick YouTube search will yield many examples built as applications of Box2D. The engine applies physical laws like gravity, friction, and force to the objects displayed on the screen. Many tools and languages support physics engines; the Box2D API is one of the standard frameworks and is available in various programming languages, including C++, Java, JavaScript, Clojure, and Python. All of these ports also ship a testbed for writing applications against the Physics Engine and testing them in a GUI. There is even an IDE built around the Physics Engine, iforce2d RUBE.

What is HTML5 Canvas? HTML5 Canvas is becoming the de facto standard for displaying graphics on a web page; its closest competitors are Flash and WebGL. In the next few blogs I will discuss the architecture of how I built the application, so stay tuned.

Meet Amazon EC2: Bigdata on the Cloud

I have been exploring Amazon EC2 as a cloud-based alternative for a mid-sized software product company. I stumbled across this slideshow presentation from Netflix, a pioneer in running its entire IT on Amazon's EC2 platform. As per Netflix, they don't have any data centers; amazing, isn't it?

So I started exploring Amazon EC2 and how a company can run its entire IT on it. I was most interested in the technology standpoint.

For a start, Amazon lets you create various Linux instances, including Ubuntu, for free. They are elastic servers, where you can increase the RAM and processing power on demand. Once you set up the instance, you can ssh onto the machine and do pretty much whatever you want. Refer to this YouTube link for how to set up Amazon.
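A first connection to a fresh Ubuntu instance typically looks like the sketch below; the key file name and the instance DNS are placeholders, and the ssh line is left commented since it needs a live instance.

```shell
# the .pem key pair is downloaded from the AWS console; created here only as a placeholder
touch my-key.pem
# ssh refuses to use a private key file with loose permissions
chmod 400 my-key.pem
# connect as the default 'ubuntu' user of an Ubuntu AMI:
# ssh -i my-key.pem ubuntu@<instance-public-dns>
```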

As per Amazon, you get 750 hours of free server usage per month; in simple words, that is plenty for testing your business idea. There are standard Amazon Machine Images (AMIs) with various pre-configured stacks, including LAMP, so developing a decent web application and exposing it to users is easy.

The interesting thing I noticed is that it has good Hadoop and MapReduce support. For details on how to set up Hadoop in Amazon, refer to this YouTube link. There are a few command-line interface (CLI) tools to manage EMR.

In Amazon, the equivalent of HDFS is S3, and the equivalent of a Hadoop cluster is Elastic MapReduce.

Cloudfoundry and MongoDB NoSQL sample application

Introduction: MongoDB NoSQL

Download the source code here.

Like a lot of you folks, I also had a question: what is cloud computing? I started googling, downloaded a few tools, played with them, and understood a few concepts.

In this section I will discuss one of the key concepts of cloud computing: Platform as a Service, a.k.a. PaaS, a.k.a. a Cloud Platform. Basically, a Cloud Platform provides development support in a local environment and deployment into a remote environment, where the platform "introspects" the application to determine which web server and database it should run against. Let me illustrate this with a diagram.

[Diagram: MongoDB NoSQL]

There are a few leading players in this space (Heroku and Microsoft Azure). Recently VMware entered this space with its own Cloudfoundry. Coming from a Java background, I find it a good tool for understanding the details of a Cloud Platform. It is well integrated with the STS IDE; it is a bit buggy, but there are workarounds.

This section discusses:

  1. How the Cloudfoundry Cloud Platform supports the web server and database
  2. How the Cloudfoundry Cloud Platform helps with database seeding
  3. Deploying to the Cloud Platform
  4. The 'glue' to inform the Cloud Platform about the web server and database
  5. The 'glue' to generate the db schema and the seed data


I will walk you through a simple example using Spring MVC and MongoDB, where you do basic CRUD operations on a person collection. MongoDB is a document-oriented database used to store large amounts of data; it also has Map/Reduce capabilities similar to Hadoop.

To quickly start on this,

Cloudfoundry support:

Web server and database support: Cloudfoundry supports SpringSource tc Server, and it also supports Jetty if used with Maven. On the database side, it supports MySQL, vPostgres, and MongoDB. It can introspect the Spring context file to understand which type of database the application uses: if you have a bean of type "org.apache.commons.dbcp.BasicDataSource", it can bind it to the respective database, and if it is MongoDB, it needs a mongo-db-factory as shown in this example.
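A minimal sketch of the MongoDB side of such a context file is shown below; the bean ids, host, and dbname are assumptions, and the cloud-namespace variant comes from the cloudfoundry-runtime library, so check the docs for your version.

```xml
<!-- local development: plain Spring Data MongoDB factory -->
<mongo:db-factory id="mongoDbFactory" host="localhost" port="27017" dbname="test"/>

<!-- on Cloudfoundry, the factory bound to the provisioned mongodb service instead: -->
<!-- <cloud:mongo-db-factory id="mongoDbFactory"/> -->

<bean id="mongoTemplate" class="org.springframework.data.mongodb.core.MongoTemplate">
    <constructor-arg ref="mongoDbFactory"/>
</bean>
```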

Database seed data population: Typically, if you have a "jdbc:initialize-database" configuration in your application, Cloudfoundry will execute that script against its bound database.
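For a relational service, that configuration is the standard Spring jdbc namespace element; the script locations below are placeholders for your own SQL files.

```xml
<jdbc:initialize-database data-source="dataSource">
    <jdbc:script location="classpath:schema.sql"/>
    <jdbc:script location="classpath:seed-data.sql"/>
</jdbc:initialize-database>
```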

Configuration in the application:

Deploying to Cloudfoundry: There are two ways of deploying the code to Cloudfoundry: the command line and the STS IDE.

If you want to deploy and run the application using the VMC command line, you need to do the following:

  • Move to the target folder
  • vmc target http://api.{instancename}
  • vmc push
  • Give the application name as 'spring-mongodb'
  • Bind it to the 'mongodb' service
  • Save the configuration

Now open the browser and go to 'http://spring-mongodb.{instancename}'

If you want to deploy and run the application from the STS IDE, you need to do the following:

  • For setting up your STS to work with Cloudfoundry, refer to this link.
  • Import the Maven project into your STS.
  • Create a new Cloudfoundry server, add the spring-mongodb application, and publish the application WAR to the Cloudfoundry host to see your changes.
  • You can also access the remote system for error logs and files, as mentioned in the SpringSource blog above.

Glue to inform Cloudfoundry about the database:

The key changes you have to make in your application to work with Cloudfoundry:

  1. Maven changes: add the CloudFoundry runtime dependency to the POM (the spot is marked with a <!-- CloudFoundry --> comment)
  2. MongoDB configuration
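For reference, the Maven change (item 1 above) is typically a single dependency on the CloudFoundry runtime; the version shown here is an assumption, so use whatever is current.

```xml
<!-- CloudFoundry -->
<dependency>
    <groupId>org.cloudfoundry</groupId>
    <artifactId>cloudfoundry-runtime</artifactId>
    <version>0.8.1</version>
</dependency>
```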

Glue to generate the db schema and the seed data in Cloudfoundry:

You can create a bean called InitService with an init method and add all the seed data the application needs, as below:

public class InitService {
    private MongoTemplate mongoTemplate;
    public MongoTemplate getMongoTemplate() {
        return mongoTemplate;
    }
    public void setMongoTemplate(MongoTemplate mongoTemplate) {
        this.mongoTemplate = mongoTemplate;
    }
    // invoked by the container to seed the data
    private void init() {
        Person p = new Person();
        mongoTemplate.save(p);
        p = new Person();
        mongoTemplate.save(p);
    }
}

In the bean context, define a bean as below:
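As a sketch (the package name and bean ids are hypothetical; init-method tells the container to call init() once the properties are set):

```xml
<bean id="initService" class="com.example.InitService" init-method="init">
    <property name="mongoTemplate" ref="mongoTemplate"/>
</bean>
```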

Other than these changes, everything else is the same as any other Spring MVC application.

As a way forward, you can get the Cloudfoundry samples; using the Git utility you can clone them onto your local machine. There is a simple hello-spring-mysql sample; you can quickly understand how a MySQL-based application works using this example.