Saturday, September 30, 2006

Why Open Source Will Rule the Software World

About five years ago, a CIO I worked for told me that eventually all "for profit" software companies would be overrun by Open Source software companies. I thought he was absolutely crazy at the time. Sure, Apache had made inroads into web server market share at that point and Tomcat was doing ok as the reference implementation of Sun's Java servlet specification but... come on. But I did stop to think about it... This guy was a retired partner from a major consulting firm who was really only working because he wanted something to do, and he had proven on many occasions to be a very insightful and brilliant man. Now, looking at the Open Source software world, I am convinced once again that he may have been crazy... but in that "there is a fine line between genius and insanity" kind of way.

It looks like even world-beater Cisco may not be safe from the pressures of Open Source. A project called Vyatta is picking up steam and really looks like it is going to bring some disruption to the otherwise boring networking gear market. Why is this simple little product disruptive to a 5 billion dollar industry? Well... it's an open source router... with all the features of high end network equipment... at one fifth the cost. The product is currently in beta but due for release soon. A recent article in Business 2.0 explained that the project was started because researcher Atanu Ghosh was studying the future of broadband and knew that to make changes to router software, he had to submit them to Cisco or some other slow-moving network leviathan. So he and some colleagues decided to write their own, caught the eye of an early Cisco employee and some venture capitalists (there is an interesting note in and of itself... venture capital firms are now funding open source software companies...) and voilà. This will be a fun one to watch.

Bernard Golden has an interesting series on CIO Magazine's blog site about why open source is not only a safe choice for companies but the smart choice. In the latest installment he takes the example of a company with $32 million in revenue that has chosen JBoss as their application server. They could have chosen BEA, but instead of paying $500k they paid $50k. What happened to the other $450k? That roughly 1.4% of their annual revenue can instead go into product development, salaries, or any of the other things a growing company needs.

What has shifted perception from open source being cool, but only for companies willing to risk supporting it themselves, to software that is available and real for everyone? My opinion is that the ecosystem for open source has changed, and companies are seeing both the financial and the practical benefits. Really, off-the-shelf software has always been a strange model: nearly all of the cost is in construction, and after that it is essentially free to reproduce (maintenance and support are the obvious exceptions, and in custom software they are where the majority of the costs actually end up). Now you can download the software and see if it works for you; if it does, you can get support from third parties if it is needed, or even participate in the open source community as a company for the support you need. (Who would you rather have answering your support email: a call desk person who knows what's in the system from the manual and has a series of steps to obfuscate real help, or a developer who writes code in the system as part of their job?)

Friday, September 29, 2006

Symphony separates good developers from great developers

I have often observed that some of the best developers I have met have a background in music. I have done a bit of thinking about why that could be and I have a postulation.

When you learn to play music you learn how one instrument by itself can make music. Soon after that you learn how adding another instrument adds a whole new aspect to the music. Then, when you add more instruments of different types, you get yet another aspect. One after another the symphony builds, the whole becoming greater than the sum of its parts.

I think it is this appreciation for the connectedness of seemingly unconnected items that enables developers with a music background to write better code. Of course, just as not everyone who has ever picked up a violin is a stellar developer, the understanding of symphony is not limited to those who have played an instrument. The interrelation of things is something anyone can learn and observe.

Wednesday, September 27, 2006

When you are going through hell keep going

I heard a construction worker make this statement and it struck me as so profound that I thought I should write a post on it. It is often said that the hardest part of any journey is taking the first step. This can be seen in almost every approach to time management.

Step 1 - break the problem into small, easily consumable chunks.
Step 2 - get started.

But into every perfect plan some problems will fall, and how we react to these problems defines who we are. John F. Kennedy once said "History will never accept difficulties as an excuse." So we need to make sure that we adapt, move forward and keep things going. In business and in life you are not successful if you try really hard and have good excuses for why what needed to happen didn't happen.

If you think about it, the people we all like to work with most are the people with positive energy. They know what they want to get done and choose to make the best and most out of every day. They are grounded people who know who they are and are determined to meet their goals, and they can be anywhere in an organization.

I am a big believer in the sentiment that if you never break anything you are not doing anything of value. A good chunk of the long-term success of that statement, though, is that you don't give up. Thus the profoundness of the statement... if you are going through hell, keep going. To me this equates to sticking to your guns and not giving up because things get hard. It doesn't mean that working 120 hours every week is ok. A sustainable pace is necessary, and we need to plan the right resources, phases and release plan to make sure things go right.

Tuesday, September 26, 2006

SOA Day 7 - From Object Oriented to Message Oriented

Day 1 - Connections = Cost to Connections = Value
Day 2 - Function Oriented to Process Oriented
Day 3 - Build to last to Build for change
Day 4 - Prolonged Development to Incremental Deployment
Day 5 - Application Silos to Orchestrated Solutions
Day 6 - Tightly Coupled to Loosely Coupled

This is the last in my series of mental shifts. Hopefully people found some value in them rather than me just preaching via electrons. I truly believe that if we take these simple shifts to heart we will have better overall systems, not just a good SOA. That said, on to today's post... shifting our mindset from Object Oriented to Message Oriented. This is not to say that Object Oriented programming is evil, but rather that SOA is an offshoot of Object Oriented development focused on the integration of components and the communication between them, and as such it needs to be approached differently.

As we start to change the other aspects of our mindset we will be required to change our thinking from the specifics of application development to how application components will communicate and connect. Since we will have loosely coupled the application components, we will need to think about how we will message between them. In order for orchestration to work we need to know what will be sent in the communications between components. Since we will have components that evolve incrementally, we will need to build messages that evolve gracefully as things change.

By making this shift we will be able to safely move interactions that previously required synchronous communication over to asynchronous communication. With this shift we will be able to take advantage of queues, allowing our application components to consume and compute as fast as they are able. We will be able to maintain protection of and between components because it will be built into the architecture.
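To make the shift a little more concrete, here is a minimal sketch of an asynchronous, queue-based consumer using the standard JMS API. The queue name, the message format and how the ConnectionFactory gets obtained are all illustrative assumptions on my part (they are provider-specific), not anything prescribed by SOA itself.

import javax.jms.Connection;
import javax.jms.ConnectionFactory;
import javax.jms.JMSException;
import javax.jms.Message;
import javax.jms.MessageConsumer;
import javax.jms.MessageListener;
import javax.jms.Queue;
import javax.jms.Session;
import javax.jms.TextMessage;

// A minimal sketch of a message-oriented consumer. Any JMS provider
// (or other MOM) could sit behind this without the consumer changing.
public class BookingEventConsumer implements MessageListener {

    public void onMessage(Message message) {
        try {
            if (message instanceof TextMessage) {
                String payload = ((TextMessage) message).getText();
                // Process the message at whatever pace this component can sustain.
                System.out.println("Received booking event: " + payload);
            }
        } catch (JMSException e) {
            // In a real system this would go to error/dead-letter handling, not stderr.
            e.printStackTrace();
        }
    }

    public static void listen(ConnectionFactory factory) throws JMSException {
        Connection connection = factory.createConnection();
        Session session = connection.createSession(false, Session.AUTO_ACKNOWLEDGE);
        Queue queue = session.createQueue("bookings.events"); // illustrative queue name
        MessageConsumer consumer = session.createConsumer(queue);
        consumer.setMessageListener(new BookingEventConsumer());
        connection.start(); // messages now arrive asynchronously via onMessage
    }
}

The producer never waits on this consumer; it just drops a message on the queue and moves on, which is exactly the synchronous-to-asynchronous trade described above.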

Obviously none of these things are silver bullets, but they are steps towards allowing higher level computation once again. Just as Assembly gave way to C, which gave way to C++ and then to Java as languages, so too will our application communication metaphors continue to evolve.

SOA Day 6 - From Tightly Coupled to Loosely Coupled

Day 1 - Connections = Cost to Connections = Value
Day 2 - Function Oriented to Process Oriented
Day 3 - Build to last to Build for change
Day 4 - Prolonged Development to Incremental Deployment
Day 5 - Application Silos to Orchestrated Solutions

Today's mental shift is one that I have blogged about before in other forms; shifting our mindset from Tightly Coupled to Loosely Coupled. Coupling is another word for dependency, specifically the degree to which a program module depends upon other program modules. A certain degree of dependency is obviously going to occur within programs that actually provide value. The key is to make it no more than absolutely necessary.

This is another point that builds off of Day 3. By loosely coupling code we will be better able to "roll with the punches" or shift along with the requirements as they change. Good architectural design abstracts out the bits that don't need to be specific and keeps the coupling from being overly tight.
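As a small, hypothetical illustration of what "keeping the coupling loose" looks like in code: the caller below depends only on an interface, so the concrete implementation behind it can change without a ripple effect. The names are mine, purely for illustration, not from any particular system.

import java.math.BigDecimal;

// Hypothetical example: calling code is coupled to this contract only.
public interface TaxCalculator {
    BigDecimal taxFor(BigDecimal amount, String region);
}

// One possible implementation; a web-service-backed or vendor-supplied
// calculator could be swapped in without touching any caller.
class FlatRateTaxCalculator implements TaxCalculator {
    public BigDecimal taxFor(BigDecimal amount, String region) {
        return amount.multiply(new BigDecimal("0.07")); // illustrative flat rate
    }
}

class InvoiceService {
    private final TaxCalculator taxCalculator;

    // The dependency is handed in, so InvoiceService never names a concrete class.
    InvoiceService(TaxCalculator taxCalculator) {
        this.taxCalculator = taxCalculator;
    }

    BigDecimal totalWithTax(BigDecimal subtotal, String region) {
        return subtotal.add(taxCalculator.taxFor(subtotal, region));
    }
}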

With SOA this is even more important because one of the classic issues that is seen with services is versioning. One version changes just a little bit and causes a ripple effect. There are patterns and best practices that mitigate this and... what do you know, they are focused on reducing the coupling and providing loose version ties.

Libraries and functions become services in themselves. Using orchestration to pull the pieces together, we are then able to adjust simple properties rather than needing to rewrite and recompile code. By keeping the coupling loose, systems can use the underlying technologies that work best for the application instance at hand rather than forcing separate components to be tied to the exact same approach. Of course, for all of this to work a shared semantic framework is required to ensure that messages have consistent meanings across components.

Sunday, September 24, 2006

Day 5 - Application Silos to Orchestrated Solutions

Day 1 - Connections = Cost to Connections = Value
Day 2 - Function Oriented to Process Oriented
Day 3 - Build to last to Build for change
Day 4 - Prolonged Development to Incremental Deployment

The next platform upon which I am pushing for a mental shift is fundamental to several areas, organizationally as well as for projects and management. This shift is from Application Silos to Orchestrated Solutions.

It is tempting to think about applications as separate silos of specific functionality focused on a particular business value or individual function type. However, to really take a step forward we need to shift our thinking. Components that do searching take input and provide output. Components that control inventory take input and provide output.

If we simplify the components to their base functions and then orchestrate these solutions we enable true reuse. Components can be very specialized, and it is the hooking together of the various pieces that enables the overall solutions. Orchestration solutions are everywhere these days. Whether it is an Enterprise Service Bus or just the simple wiring together of services (personally I think in many cases an ESB is overkill, but as always it depends upon the situation. Heck, there are even open source ESBs out there now, such as Open ESB on java.net, Mule, and ServiceMix from Apache, as well as the zillion dollar solutions), pulling the pieces together allows complete solutions to be built.
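To make the idea concrete, here is a minimal sketch of orchestration in plain code, with no ESB at all. The service names and methods are hypothetical; the point is only that the orchestrator owns no specialized logic of its own, it just wires specialized components together.

import java.util.ArrayList;
import java.util.List;

// Hypothetical, highly simplified services; each does one thing.
interface SearchService {
    List<String> findSkus(String query);
}

interface InventoryService {
    boolean inStock(String sku);
}

// The orchestrator just sequences the components; swap either service
// implementation and the overall solution still works.
class ProductAvailabilityOrchestrator {
    private final SearchService search;
    private final InventoryService inventory;

    ProductAvailabilityOrchestrator(SearchService search, InventoryService inventory) {
        this.search = search;
        this.inventory = inventory;
    }

    List<String> availableSkus(String query) {
        List<String> available = new ArrayList<String>();
        for (String sku : search.findSkus(query)) {
            if (inventory.inStock(sku)) {
                available.add(sku);
            }
        }
        return available;
    }
}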

This type of change is difficult to implement because there are many things that need to be specified, handled and even governed in order for it to operate smoothly. For example, when a standard message type or service setup is created, everyone must adhere to it in order for these shifts to work consistently. You can't have someone going off and building a different service approach because they don't like the current one or didn't feel they could comply and still hit a timeline.

The benefits of this approach, though, are many. You can really take advantage of reuse, since the components by definition are reusable. Data that would otherwise drift into slightly different implementations behind different systems is kept consistent. The ability to get things done on a controlled budget is enhanced simply due to the reduced maintenance. Maintenance consumes most of any system's budget; ironically, with a successful system, the longer it has been around the higher the maintenance costs become.

So while this shift requires a lot of effort, if the standards can be created and adhered to, the value truly comes forward.

Saturday, September 23, 2006

SOA Day 4 - Prolonged Development to Incremental Deployment

Day 1 - Connections = Cost to Connections = Value
Day 2 - Function Oriented to Process Oriented
Day 3 - Build to last to Build for change

Today's mind shift is to change how we develop and release: Prolonged Development to Incremental Deployment. If you have been listening to the Agile speakers, reading software development papers or news, or otherwise been paying attention, chances are you have heard this one mentioned. This mental shift is already under way. Iterative development has been in place for some time and really has good traction.

My point on this one is that we can't stop with just iterative development. The best feedback we receive is when something is in use, and in order for it to be used it needs to be in production. So this means we need to develop things in vertical slices of functionality that allow us to get them out in front of our customers. A wealth of ills can be addressed by this. It does require a more deliberate approach and an understanding of your overall application architecture. With it, though, better quality can be built and released.

Just like a fractal grows and grows, making the picture more and more complete, so does an SOA based architecture. Each bit released enables other bits to be built or released. By releasing these bits in a planned manner as quickly as possible, rather than taking months to develop the "complete solution", we can steer our development to what is really needed. Just as I said in Day 3, where we Build for Change, we can accept that our requirements are going to shift and be ready to adapt to them faster. Focus on business value, not the "complete solution". Complete solutions are a fallacy because if there is business value in a developed system, more development will be needed.

Sunday, September 17, 2006

SOA Day 3 - We have to shift our mindset if we want successful SOA

Day 1 - Connections = Cost to Connections = Value
Day 2 - Function Oriented to Process Oriented

Continuing the series of mental shifts, today we will cover how we build things and the development approach needed.

From Build to last to Build for change. This is another mindset that people struggle to break out of. It is tempting to think that we are building a system or component that will last forever and will never need to change. To support this we tend to build in "what if's" and architect a solution until we are blue in the face. What we really need to do is focus on building in abstraction layers and flexibility, accepting that the system and its requirements are going to change.

An interesting observation in architecture is that the requirements with the most churn around them are the ones that need the most flexible designs. A good portion of this flexibility can (and should) be built in through abstraction. If it doesn't matter what whatchamadigger you have behind the library, because the library is generic, then you have the ability to change along the way. This goes for both custom/internal code and external components.

A good example of this is the various MOM (message-oriented middleware) efforts everyone seems to be working on. Currently they are building abstraction libraries for several potential message queue systems. WebSphere MQ, JBoss Messaging and SeeBeyond are all potential solutions, and rather than building to one specific solution a better approach is to build interfaces that abstract away which mechanism is used behind the scenes. In this manner the teams doing the integration don't need to worry about the plumbing and can focus on the application logic that differentiates applications instead of the things that should "just work". Another major benefit of this approach is that it frees teams to move to new and different technologies with minimal rework, and this is where real negotiating power with vendors and true technology flexibility, by avoiding lock-in, can be found.
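Here is a minimal sketch of what such an abstraction library might look like, assuming a simple send-only channel. The interface, class and destination names are illustrative assumptions of mine; the provider-specific plumbing (WebSphere MQ, JBoss Messaging, SeeBeyond or anything else) stays hidden behind the implementations.

// Hypothetical abstraction layer over whichever MOM product is in use.
public interface MessageChannel {
    void send(String destination, String payload);
}

// One implementation per provider; application teams never see these directly.
class WebsphereMqChannel implements MessageChannel {
    public void send(String destination, String payload) {
        // WebSphere MQ specific connection and queue-manager plumbing goes here.
    }
}

class JbossMessagingChannel implements MessageChannel {
    public void send(String destination, String payload) {
        // JBoss Messaging specific plumbing goes here.
    }
}

// Application code depends only on the interface, so switching providers
// becomes a configuration change rather than a rewrite.
class OrderPublisher {
    private final MessageChannel channel;

    OrderPublisher(MessageChannel channel) {
        this.channel = channel;
    }

    void publishOrder(String orderXml) {
        channel.send("orders.inbound", orderXml); // illustrative destination name
    }
}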

SOA Day 2 - SOA Mindshift - Function over form, Process over procedure

Day 1 - Connections = Cost to Connections = Value


Day 2 in the series of SOA mindshifts we need to make in order to ensure a successful SOA.

From Function Oriented to Process Oriented. Thinking about functions is usually easier, since that is what we break things down into during development. The subroutines that pull together a function make it possible for us to take things in smaller chunks. Functions, though, are not what provide a business with value. Any particular function is only valuable if it enables the business processes in which it sits. If we can make ourselves think about the process rather than sub-optimizing on the functions, we can drive a better overall solution because we see the bigger picture.

This point hits many different areas. If we consider logging as an example, we would rather know that a process path for booking has gone down than that a specific function within that path has gone down. With the information at the business process level we are able to know what other business processes are impacted. As technologists we want to know what piece is breaking, but really that is part of the detail of fixing the problem... the business wants to know the impact. Other areas are similarly affected.
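As a small, hypothetical sketch of what process-oriented logging can look like: the business process is the first-class piece of information, and the failing step is just supporting detail for whoever does the fix. The names here are mine, purely for illustration, using the standard java.util.logging API.

import java.util.logging.Level;
import java.util.logging.Logger;

// Hypothetical sketch: alerts are raised against the business process,
// with the failing step attached as detail for the people fixing it.
public class ProcessMonitor {
    private static final Logger LOG = Logger.getLogger("business.processes");

    public static void processDown(String processName, String failingStep, Throwable cause) {
        LOG.log(Level.SEVERE,
                "Business process '" + processName + "' is down (failing step: " + failingStep + ")",
                cause);
    }
}

// Usage: ProcessMonitor.processDown("booking", "fare-calculation", exception);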

Another benefit of this mindset change is that we will naturally start to take a few steps back when modeling out problems. In the "can't see the forest for the trees" sense, it becomes easier to see the forest because we are looking at the process and how the trees interact rather than studying the bark and how it is brown. Focusing on the whole picture is important for many reasons.

Mindsets must shift if we want successful SOA

In my next series of posts I will propose a series of mental shifts that must occur in order to allow effective SOA to become a reality. Each of these will be proposed in a "From X to Y" manner over the next several posts. Let me know with a comment if you think I am crazy or if these make sense.


From Connections = Cost to Connections = Value. For a long time we have had the mentality that additional connections drive cost. In an SOA environment this changes. The more connections we have, the more valuable our overall solution becomes. (My informal polls tell me that this is the shift most people struggle with.)


This can be seen in my earlier point about complexity being a math problem. The more connections there are, and the easier they are to make, the more viable business models, execution methods and doors open up. The Coase Theorem has been hard for some to accept as meaningful for technology conversations because they struggle to see a connection as anything other than a cost. In the mainframe world, where there was a limited number of connections, this was obvious. Now that connections are ubiquitous, with the proliferation of the internet and IP connectivity, the equations have changed.
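To put a rough number on the math: with n systems there are at most n x (n-1) / 2 distinct pairwise connections, so 10 systems can form up to 45 links and 20 systems up to 190. Whether each of those potential links is a cost to be minimized or a source of value to be exploited is exactly the mindset shift this post is about.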


Tuesday, September 05, 2006

A gap between stimulus and response

Not too long ago I was introduced to the concept of The Singularity. The idea behind this is that eventually computers will improve to the point that they are smarter than human beings. Essentially this is when computers reach the point where they can improve themselves. The idea of the singularity is that at this point it becomes impossible to tell what will happen next. A similar theory, called the "empty planet syndrome", involves nanotechnology and a similar sort of runaway improvement to a place we cannot comprehend, where with a thought we could wipe ourselves out.

Of course there is debate about this and its possibility. Movies like I, Robot or The Matrix and others of their ilk point to a rather scary future. On the other hand, though, I think there are several reasons why this won't come to pass.

My favorite reason (yes, I have favorites in my own head) is that computers are like incredibly powerful left brains, but they lack right brain ability. Most fundamentally, they lack the ability to truly choose their own actions. This choice is why someone who grows up in a family with a long, long history of "specific problem X" can overcome their biological programming and choose to become something else. In the end a computer will only respond to the ones and zeros at the heart of its algorithms. In AI research it is interesting to watch how, once a new and great AI algorithm is created... it's no longer AI... it's just an algorithm... just code. Humans have a gap between stimulus and response and thus can choose their own actions.

Computers handle logic, rules and conditions with amazing ease. This is why Deep Blue was able to beat chess grandmaster Kasparov. While in the early days Kasparov was quoted as saying he would rip the computer to shreds, he eventually lost and stated that computers would continue to win. Just as John Henry was beaten by a machine that heralded the industrial age, so too did Deep Blue usher in the information age. All of that wondrous tech aside... computers still can't recognize a face. (Algorithms are being built that key off of facial features, voice tones and other previously right-brain activities, and they will continue to improve, but the basic limits are still there.)