Archive for the ‘Cloud Computing’ Category

My continued association with Computer Science

August 8, 2015

It has been 27 years of my association with “Computer Science” today!

Recently I heard a student remark while selecting an undergraduate course: “What is there in computer science? I can learn Java on my own.”

My clarification is as follows: “Computer Science” is not just a programming language or the skill of writing a program. It requires a deep understanding of operating systems, memory management, compilers, data structures and algorithms, data storage, compression, encryption and security, parallel processing, analytics and on and on….

Along with that understanding, the ability to apply it to implement algorithms with the available computing resources to solve problems is what makes up the study of “Computer Science”.

Of late, I have started learning the statistical language “R” and am experimenting with applying machine learning to some #kaggle challenges. My first submission to a competition (I currently stand at 1110th position):

Having moved my regular technology blogging to LinkedIn, my last year’s post: 

Earlier on this blog: 

All my posts from last year can be found on LinkedIn:

So, like any other subject, “Computer Science” has a lot of depth and breadth, if one wants to explore it!!

Data streams, lakes, oceans and “Finding Nemo” in them….

April 4, 2014

This weekend, I complete three years of my second innings at TCS. For most of those three years I have been working with large insurance providers, trying to figure out ways to add value to their operations and strategy through technology.

The same period has been one of re-imagination. Companies and individuals (consumers / employees) are slowly moving towards reimagining themselves in the wake of converging digital forces: cloud computing, analytics & big data, social networking and mobile computing.

The focus of my career in Information Technology has always been “Information”, not technology. I am a firm believer in “information”-led transformation rather than “technology”-led transformation. The basis for information is data, along with the ability to process and interpret that data, making it applicable and relevant to the operational or strategic issues being addressed by corporate business leaders.

Technologists are busy claiming that their own technology is best suited to current data processing needs. Storage vendors are finding business in providing storage in the cloud. Mobility providers are betting big on wearable devices making computing ever more pervasive. Big industrial manufacturers are busy embedding sensors everywhere and connecting them to the internet, following the trend set by the human social networking sites. A new breed of scientists, calling themselves data scientists, are inventing algorithms to quickly derive insights from the data being collected. Each one pushes itself to the front, taking the support of the others to market itself.

Amid the rush, there is a distinctive trend in the business houses: the CTO projecting technology as a business growth driver and taking a dominant role is common. Data flows have to be plumbed across the IT landscape and across various technologies, causing a lot of hurried and fast-changing plumbing issues.

In my view, data flow should be natural, just like streams of water. Information should flow naturally through the landscape, and technology should be used to make the flow gentle, avoiding floods and tsunamis. Creating data pools in cloud storage and connecting those pools into a knowledge ecosystem that grows the right insights for the business context remains the big challenge today.

Information architecture in the big data and analytics arena is like dealing with big rivers: building the right reservoirs and connecting them to get the best benefit for the landscape. And a CIO is still needed, and responsible for this, in the corporation.

If data becomes an ocean and insights become an effort like “Finding Nemo”, the overall objective may be lost. Cautiously avoiding the data ocean, let us keep (big) data in its pools and lakes as usable information while reimagining data in the current world of re-imagination. This applies to corporate business houses as well as individuals.

Hoping innovative reimagination in the digital world helps improve life in the ecosystems of the real world….

Cloud Architecture Security & Reliability

January 31, 2014

Yesterday I gave a presentation at SSIT, Tumkur on Cloud Architecture Security & Reliability to the faculty members of SSIT and SIT, Tumkur.

With the advent of the Cloud Computing paradigm, at least five categories of “actors” have emerged:

  1. Cloud Consumers
  2. Cloud Providers
  3. Cloud Brokers
  4. Cloud Auditors
  5. Cloud Carriers

The NIST conceptual reference model gives a nice overview of these.


Security, and more specifically “information security”, is a cross-cutting concern across all these actors. The CSA publishes top threats regularly here. The top threats for 2013 are:

  1. Data Breaches
  2. Data Loss
  3. Account Hijacking
  4. Insecure APIs
  5. Denial of Service
  6. Malicious Insiders
  7. Abuse of Cloud Services
  8. Insufficient Due Diligence
  9. Shared Technology Issues

All these threats translate to protecting four major areas of Cloud Architecture…

  1. Application Access – Authentication and Authorization
  2. Separation of Concerns – Privileged user access to sensitive data
  3. Key Management – of encryption keys
  4. Data at Rest – Secure management of copies of data
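As an illustration of the first two areas, here is a minimal sketch of role-based authorization with audited access to sensitive data. All the role and resource names are hypothetical, purely for illustration:

```python
# Minimal sketch of role-based access control over sensitive data.
# Roles, resources, and the policy table are invented examples.

SENSITIVE = {"customer_pii", "encryption_keys"}

# Which roles may read which resource classes.
POLICY = {
    "app_user":  {"public_docs"},
    "dba":       {"public_docs", "customer_pii"},
    "key_admin": {"encryption_keys"},
}

def authorize(role: str, resource: str) -> bool:
    """True only if the role's policy explicitly grants the resource."""
    return resource in POLICY.get(role, set())

def audited_read(role: str, resource: str) -> str:
    # Privileged access to sensitive data is both checked and logged.
    if not authorize(role, resource):
        raise PermissionError(f"{role} may not read {resource}")
    if resource in SENSITIVE:
        print(f"AUDIT: {role} accessed {resource}")
    return f"<contents of {resource}>"
```

The point of the sketch is separation of concerns: even the privileged `dba` role cannot touch the encryption keys, which belong only to `key_admin`.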

Interestingly, the ENISA threat landscape also points to similar emerging threats related to Cloud Computing –


Is there any shortcut to achieving security for any of the actors in the Cloud? I do not think so. The perspective presented by Booz & Co on cloud security has a nice ICT resilience life cycle that was discussed.

Finally, there was a good discussion on Reliability and Redundancy. The key question was how to achieve better reliability in a complex IT system consisting of multiple components across multiple layers (i.e., web, application, database): making the best use of the non-failing components to share the load, while isolating the failed component, decoupling it from the cluster and seamlessly rebalancing the workload across the remaining working components.
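A toy sketch of that rebalancing idea, assuming a hypothetical application-tier cluster where work units can simply be reassigned between nodes:

```python
# Sketch: isolate a failed node and rebalance its workload across the
# surviving nodes of a (hypothetical) application-tier cluster.

class Cluster:
    def __init__(self, nodes):
        # each node carries a list of workload units
        self.workload = {n: [] for n in nodes}

    def assign(self, unit):
        # place new work on the least-loaded healthy node
        node = min(self.workload, key=lambda n: len(self.workload[n]))
        self.workload[node].append(unit)

    def fail(self, node):
        # decouple the failed node and redistribute its orphaned units
        orphaned = self.workload.pop(node)
        for unit in orphaned:
            self.assign(unit)

cluster = Cluster(["app1", "app2", "app3"])
for i in range(9):
    cluster.assign(f"job-{i}")
cluster.fail("app2")   # app2's jobs move to app1 and app3
```

A real cluster manager also needs health-check probes, state migration and split-brain protection; the sketch shows only the isolate-and-rebalance step.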

Overall it was a good session to interact with academia!

The slide deck that was used:

Crisscrossing thoughts around #Cloud and #BigData

August 2, 2013

While “Big Data Analytics” runs on cloud-based infrastructure with thousands of (virtual) servers, cloud infrastructure management has itself become a big data problem!

All key availability and performance metrics need to be collected and processed regularly, both to keep the cloud infrastructure running within the agreed performance service levels and to identify trends in demand for cloud services. There is therefore an absolute need for predictive analytics on the collected metrics data.

As data centers gradually turn into private clouds with a lot of virtualization, it becomes increasingly important to manage the underlying grid of resources efficiently by allocating the best possible resources to the high-priority jobs. An integrated infrastructure monitoring and analytics framework, running on the grid itself and dynamically optimizing resource allocation to fit workload characteristics, could make the data center more efficient and green.
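As a rough illustration of predictive analytics on collected metrics, a least-squares trend projection over utilization samples might look like the following. The sample numbers and the 80% threshold are invented for the example:

```python
# Sketch: fit a linear trend to collected utilization samples and project
# ahead, to decide whether capacity should be added.

def linear_forecast(samples, steps_ahead):
    """Least-squares line through (i, samples[i]); value steps_ahead later."""
    n = len(samples)
    xs = range(n)
    mean_x = sum(xs) / n
    mean_y = sum(samples) / n
    slope = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, samples)) \
            / sum((x - mean_x) ** 2 for x in xs)
    intercept = mean_y - slope * mean_x
    return intercept + slope * (n - 1 + steps_ahead)

cpu_pct = [52, 55, 54, 58, 61, 63, 66, 68]   # hourly CPU utilization samples
projected = linear_forecast(cpu_pct, steps_ahead=24)
if projected > 80:
    print(f"projected {projected:.0f}%: schedule additional capacity")
```

Real capacity planning would use seasonality-aware models rather than a straight line, but the shape of the loop is the same: collect metrics, project the trend, act before the service level is breached.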

Taking the same approach to business services across organizational boundaries, there could be an automated marketplace where public cloud providers trade their available computing resources; consumers would “buy” the computing resources they need in the market and get their processing executed, probably by combining multiple providers’ resources into an extended hybrid cloud in a highly dynamic configuration.

Data and processing would have to be encapsulated as micro- or nano-scale objects, taking computing out of the current storage/processor architecture into a more connected, neuron-like architecture with billions of nodes: a really BIG big data.


If all the computing needed on this tiny globe could be unified into a single harmonic process, the amount of data that needs moving would come to a minimum, and a “single cloud” would serve the purpose.

Conclusion: Cloud management using bigdata, and big data running on cloud infrastructure complement each other to improve the future of computing!

Question: If I have a $1 today, where should I invest for better future? In big data? Or in Cloud startup??

Have a fabulous Friday!

Cloud Computing workshop slides

February 8, 2013

Cloud Computing – Foundations, Perspectives and Challenges workshop slides.

Presented at the BITES state-level faculty development program at Mangalore today, to around 60 faculty members from regional engineering colleges.

Cloud prasad chitta_8th_feb2013 from Prasad Chitta

A journey through Grid and Virtualization leading to Cloud computing

January 11, 2013

In computing, there are two trends I have seen crisscrossing throughout my career:

  1. Making a single computing node look like multiple nodes, using virtualization.
  2. Making multiple computing nodes work as a single whole, called a cluster or grid.

There was an era when computing was really “big iron”: computers were mainframe-sized and provided a lot of computing capacity, with one computer handling multiple users and multiple operations at the same time. The virtual machine was built into IBM’s mainframe operating systems, and DEC VAX mainframes had similar concepts. Of late we see the same trend even on desktops, with hypervisors like VMware coming out.

With the advent of mid-range servers of limited capacity, there was a need to put them together to get higher computing power to deal with the demand. The first commercial cluster was developed by DEC on ARCnet, even though there has always been a fight between IBM and DEC over who invented clusters. Clustering provides high availability and fault tolerance along with higher computing capacity. Oracle was the first database to implement a parallel server, on an ARCnet cluster for the VAX operating system.

This trend of cluster computing achieved supercomputing by breaking a complex task into multiple parallel streams and executing them on multiple processors. The fundamental challenge in clustering is process coordination and access to shared resources, which is why cluster nodes are locally networked over a high-performance local network.

A related concept is grid computing, where an administrative domain connects loosely coupled nodes to perform a task. So we have more and more cores, processors and nodes in a grid, providing low-cost, fault-tolerant computing: smaller components put together to look like a giant computing capacity.

Finally, what I see today is the “Cloud”, which creates a grid of elastic nodes that appears as a single (large) computing resource and gives a slice of virtualized capacity to each of that resource’s multiple tenants.

Designing solutions in each of these technologies, from big iron, virtualization, clusters and grids to the Cloud world, has really been challenging and keeps the job lively…

Enterprise Solution Architect

September 13, 2010

My teenage son sometimes wonders about what I do in the office to make my living….

Let me try to define the phrase “Solution Architecture”. To do so, I must first define the words that constitute the phrase, in the correct context.

Solution :

A SYSTEM continuously experiences environmental change! Change in the system environment is a natural process. Sometimes the change is gradual and other times the change is sudden.

In any case, a system (an enterprise itself is a SYSTEM!) has to have a built-in mechanism to tackle the change.

When a “change” can’t be handled by the system within its scope of operation, the change is called a “problem” that needs a “Solution”.

This is the tricky part of the phrase “Solution Architecture”!

Architecture is a discipline: a set of principles, methods and guidelines that can be used to create a “model” of a “system” from the multiple viewpoints of the system’s different stakeholders. Once the model is constructed, it can be used to give those stakeholders a holistic view of the system. It greatly helps the understanding of the problem and channels thought towards the challenge caused by the “problem”.

Overall, solution architecture is the discipline of finding opportunities to improve the system gently, tackling the challenges posed by environmental changes and making the system more “responsive” to future challenges, by creating a multi-dimensional model of the system!

Enterprise Architecture

In the modern day, every business organization is seen as a SYSTEM. So the Enterprise Architecture (EA) discipline is divided into four layers:

  1. Business, (what makes up the business – people, processes and tools; Its goal, structure and functioning)
  2. Data/Information, (what key data comes in, and what information is needed for the business functions, both operational and strategic)
  3. Applications (How the data is captured, processed, stored and converted into the useful information and knowledge and presented to the right people at the right time!)
  4. Technology/Infrastructure Architectures. (What physical servers, network and storage components are required to run the applications within budget, while meeting service levels)

With the “Cloud Computing” paradigm in, the business is seen as a set of loosely coupled services, and each service can have three-layered “clouding”: SaaS – Software as a Service, PaaS – Platform as a Service, and IaaS – Infrastructure as a Service. Cloud computing has changed the way we look at architecture in the Data/Information, Application and Technology/Infrastructure layers. An architect should consider the possible deployment models of the Cloud in these layers: public (an external supplier hosting the cloud), private (hosting a cloud within the enterprise datacenter) or hybrid (a combination of public and private).

To make it simple: as an Enterprise Solution Architect, I draw different shapes and name them differently; I connect those shapes to form a smooth-looking flow between the multiple layers of the enterprise and convince the key stakeholders that they have understood their current problem and the different possible solutions to it. I then help them select the solution option best suited to their budget and needs…..

It is FUN as I enjoy this!!

Enterprise Data Fabric – data grids

April 10, 2010

Enterprise Data Fabric or “in memory data grid” can improve the distributed or clustered application performance dramatically.

The Problem:
Distributed applications need to share “application objects” across multiple processing nodes. As application objects are not “relational”, Object Relational Mapping (ORM) is needed to share them through a relational database.

The ORM technology involves converting each object into a set of relational (table/column) values and back, and it turns slow.

The Solution:
Use an in-memory data grid to store the application objects in a distributed, multi-node cache, managing transactions in a once-and-only-once manner.

This will dramatically improve the performance and scalability of the application.
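A minimal sketch of the idea, with a hypothetical hash-partitioned cache standing in for a real data grid (which would add replication and transactional guarantees):

```python
# Sketch of a partitioned in-memory object cache: application objects are
# stored directly on one of several cache nodes (chosen by key hash), so no
# object-relational mapping round trip is needed. The node layout and the
# "policy:42" example key are invented for illustration.

import hashlib

class DataGrid:
    def __init__(self, node_count):
        # each dict stands in for one cache node's local memory
        self.nodes = [{} for _ in range(node_count)]

    def _node_for(self, key):
        # hash the key to pick the owning node deterministically
        digest = hashlib.sha256(key.encode()).digest()
        return self.nodes[int.from_bytes(digest[:4], "big") % len(self.nodes)]

    def put(self, key, obj):
        self._node_for(key)[key] = obj     # object stored as-is, no ORM

    def get(self, key):
        return self._node_for(key).get(key)

grid = DataGrid(node_count=4)
grid.put("policy:42", {"holder": "A. Sample", "premium": 1200})
```

Because every client derives the same owning node from the key hash, reads and writes go straight to one node’s memory instead of through an object-to-table conversion and back.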

This is also going to be a key enabler of the Private PaaS Cloud Computing in future.

IBM WebSphere eXtreme Scale and Oracle Coherence are examples of distributed caching platforms that provide Enterprise Data Fabric or in-memory data grid solutions.

Let us see how this technology shapes up the future!

Platform As A Service – Private Cloud

February 17, 2010

Some time back, Oracle published this white paper.

This gives a clear vision of where Oracle is heading in the technological direction of Cloud Computing.

With standardization of the technology stack for enterprise applications and the fusion middleware enhancements, PaaS seems the natural direction for enterprises in the medium term.

Link to my past blog post on Private Clouds Here.

Private Clouds Again

June 17, 2009

As mentioned in the previous blog post on Private Clouds, the vendors are coming out with some products on this technology.

IBM launched CloudBurst, and HP the BladeSystem Matrix.

All three types of private clouds look promising at the current moment: IaaS, PaaS and SaaS. The service-oriented, self-provisioning model of the Cloud and pay-as-you-go billing are the key success factors for Clouds.