Wednesday, December 29, 2010
While implementing enterprise SOA, it is important to consider deploying a service catalog. There is a lot of confusion between the concepts of a registry, a repository and a service catalog.
Traditionally, a registry has been a lookup service provided to service consumers: service providers register their services in the registry, and service consumers select an appropriate service for their needs. Standards such as UDDI addressed these needs. The registry would contain the service descriptions, service contracts and service policies that describe a service. Service registries have also been used in practice for determining a service's end-point address at runtime, based on the service's unique name.
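As a rough illustration of that runtime lookup pattern, here is a minimal Java sketch. The ServiceRegistry interface and the service name are hypothetical, invented for this example; a real client would delegate to a UDDI inquiry API or a vendor registry SDK.

```java
// Hypothetical lookup contract, for illustration only; a real
// implementation would wrap a UDDI inquiry API or a vendor registry SDK.
interface ServiceRegistry {
    /** Resolves the current end-point URL for the given unique service name. */
    String resolveEndpoint(String serviceName);
}

class OrderServiceClient {
    private final ServiceRegistry registry;

    OrderServiceClient(ServiceRegistry registry) {
        this.registry = registry;
    }

    void submitOrder(String orderXml) {
        // The end-point is resolved at runtime by the service's unique name,
        // so the provider can move hosts without consumers being recompiled.
        String endpoint = registry.resolveEndpoint("com.acme:OrderService:v1");
        // ... invoke the service at 'endpoint' via the usual SOAP/HTTP stack
    }
}
```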
So what is a repository? As the importance of SOA governance grew, it became necessary to capture more metadata about a service. A service repository integrates information about a service from multiple sources and stores it in a centralized database. Service information may include design artifacts, deployment topologies, links to the service code repository, service monitoring statistics, etc. Vendors have started positioning their generic asset management products as SOA repositories, e.g. Rational Asset Manager.
Many vendors now sell a combined product consisting of both the registry and the repository, e.g. IBM WebSphere Service Registry and Repository.
A service catalog, on the other hand, is a concept rather than a product, and it can be implemented using SOA registry/repository products.
Monday, December 27, 2010
Entities Vs Value Objects
In Domain-Driven Design, we often separate Entities and Value Objects. Junior architects often get confused between these two concepts.
The essential difference is that domain entities have an identity and a lifecycle: each entity has a unique identity, and within a given domain no two entities can share the same identity. Value objects need not have an identity; if their equals() method compares the attribute values of the objects, then two value objects with the same attributes are considered identical. Value objects should ideally also be immutable.
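A minimal Java sketch of the distinction, using a hypothetical Customer entity and Money value object: the entity's equality rests solely on its identity, while the value object's equality rests purely on its attribute values, and the object itself never changes.

```java
import java.util.Objects;

// Entity: identity-based equality; its attributes may change over its lifecycle.
class Customer {
    private final String customerId; // unique within the domain
    private String name;             // mutable attribute

    Customer(String customerId, String name) {
        this.customerId = customerId;
        this.name = name;
    }

    void rename(String newName) { this.name = newName; }

    @Override
    public boolean equals(Object o) {
        return o instanceof Customer
                && customerId.equals(((Customer) o).customerId);
    }

    @Override
    public int hashCode() { return customerId.hashCode(); }
}

// Value object: immutable, with value-based equality and no identity of its own.
final class Money {
    private final long amountInCents;
    private final String currency;

    Money(long amountInCents, String currency) {
        this.amountInCents = amountInCents;
        this.currency = currency;
    }

    // "Changing" a value object means returning a new instance.
    Money add(Money other) {
        return new Money(amountInCents + other.amountInCents, currency);
    }

    @Override
    public boolean equals(Object o) {
        if (!(o instanceof Money)) return false;
        Money m = (Money) o;
        return amountInCents == m.amountInCents && currency.equals(m.currency);
    }

    @Override
    public int hashCode() { return Objects.hash(amountInCents, currency); }
}
```

So new Money(500, "USD").equals(new Money(500, "USD")) is true even though the instances are different, whereas two Customer objects are equal only if they share the same customerId.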
The following links offer interesting stuff on this concept.
1) Lostechies
2) StackOverflow
3) Devlicious
Labels:
architecture,
DDD
Friday, December 17, 2010
TCO of applications during Portfolio Rationalization
In my previous blog post, I described the process of portfolio rationalization. During the fact-finding phase, we need to calculate the TCO (total cost of ownership) of each application. It’s a good idea to have a predefined template for capturing all the parameters that add up to the total cost of the application, e.g. hardware costs, software license costs, maintenance costs, data center costs, etc.
We should also try to collect TCO statistics over a period of time, say the last 3-5 years. This data, when plotted on a graph, helps in identifying patterns and spotting trends. For example, if the TCO of an application shows a steep increase with every passing year, then we need to be wary of the “cost of inactivity”, i.e. what it will cost us if no action is taken.
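As a simple illustration of the template-plus-trend idea, here is a Java sketch. The cost categories follow the list above; the 10% year-over-year threshold and the sample figures are arbitrary values chosen for the example.

```java
import java.util.Arrays;
import java.util.List;

// Illustrative TCO template; one record per application per year.
class TcoRecord {
    final int year;
    final double hardware, licenses, maintenance, dataCenter;

    TcoRecord(int year, double hardware, double licenses,
              double maintenance, double dataCenter) {
        this.year = year;
        this.hardware = hardware;
        this.licenses = licenses;
        this.maintenance = maintenance;
        this.dataCenter = dataCenter;
    }

    double total() { return hardware + licenses + maintenance + dataCenter; }
}

class TcoTrend {
    // Flags a steep rise: TCO growing more than 10% in consecutive years,
    // a crude signal to start the "cost of inactivity" discussion.
    static boolean steepRise(List<TcoRecord> history) {
        for (int i = 1; i < history.size(); i++) {
            if (history.get(i).total() > history.get(i - 1).total() * 1.10) {
                return true;
            }
        }
        return false;
    }

    public static void main(String[] args) {
        List<TcoRecord> app = Arrays.asList(
                new TcoRecord(2008, 50000, 30000, 40000, 20000),
                new TcoRecord(2009, 55000, 32000, 55000, 22000),
                new TcoRecord(2010, 60000, 35000, 75000, 25000));
        System.out.println("Steep TCO rise? " + TcoTrend.steepRise(app));
    }
}
```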
The TCO of applications should also be compared against the business value that the applications provide. It may well turn out that 70% of the TCO is consumed by applications delivering only 30% of the business value.
Another important dimension to capture is the usage statistics and performance SLAs over the last few years. If the number of users is increasing and the SLAs are not being met, then it’s time for some proactive action.
Labels:
Portfolio Rationalization,
TCO
Monday, December 06, 2010
SOA and BPM
Yesterday, we had a discussion with one of our customers on the hot topic of SOA and BPM strategy: can SOA and BPM initiatives be combined? What are the challenges, pitfalls and best practices? Jotting down some of the key points of the brainstorming session.
- To start with, it's important to realize that BPM and SOA share a common goal: greater business agility and better alignment of IT with business. SOA and BPM complement each other, and the potential benefits are compounded when you have a unified enterprise-wide strategy for both.
- BPM drives a process-centric way of thinking, right from design and implementation through monitoring and continuous optimization. BPM forces a paradigm shift from an application-centric view to a process-centric view. SOA is an architectural style, whereas BPM is a management discipline.
- A combined BPM/SOA initiative has to perform a delicate balancing act between incremental and transformational change. It should also enable stakeholders to decide which important processes need that extra agility, and to prioritize them for re-engineering as services, because funding is always limited.
- A top-down BPM approach drives the discovery of services, since process models provide important insights into which parts of the IT portfolio can be exposed as SOA services. Thus BPM can provide a structured approach for identifying reusable business services.
- SOA services also enable faster integration in BPM, as the need for custom integration touch points is reduced, which in turn enables faster deployment of BPM. SOA also enables rapid change of business processes, which is very hard when the business process is embedded in a lot of traditional non-SOA applications. For example, when a process needs to change to comply with a new regulation or due to a change in business strategy, a loosely coupled BPM process orchestrated using SOA services is easier to change: new services can be plugged in or existing services can be rearranged, as in the sketch below.
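A minimal Java sketch of that last point, with hypothetical service interfaces: since the process is just an ordered composition of loosely coupled service calls, a new compliance-check service can be plugged in between existing steps without touching the underlying applications.

```java
import java.util.ArrayList;
import java.util.List;

// Hypothetical service contract; in practice each step would be a web
// service invocation behind a generated client stub.
interface ProcessStep {
    void execute(OrderContext ctx);
}

class OrderContext { /* shared process state, elided for brevity */ }

// The business process is an ordered composition of loosely coupled steps.
class OrderProcess {
    private final List<ProcessStep> steps = new ArrayList<ProcessStep>();

    OrderProcess then(ProcessStep step) {
        steps.add(step);
        return this; // fluent style, so the code reads like a process definition
    }

    void run(OrderContext ctx) {
        for (ProcessStep step : steps) {
            step.execute(ctx);
        }
    }
}

class ProcessAssembly {
    // When a new regulation arrives, the compliance check is simply plugged
    // in between existing steps; the other services are untouched.
    static OrderProcess orderFulfilment(ProcessStep validate,
                                        ProcessStep complianceCheck,
                                        ProcessStep fulfil,
                                        ProcessStep invoice) {
        return new OrderProcess()
                .then(validate)
                .then(complianceCheck) // newly plugged-in service
                .then(fulfil)
                .then(invoice);
    }
}
```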
Labels:
BPM,
Enterprise Architecture,
SOA
Thursday, December 02, 2010
SONAR tool
My team has been evaluating the SONAR tool to manage code quality. I was impressed with the features and the user-friendliness of the tool. SONAR can be used for both Java and .NET projects. It has an open plug-in architecture that allows any code quality tool to be plugged in.
For example, for static code analysis it combines the power of popular tools such as PMD, Checkstyle and FindBugs into a unified user interface that is great to use :)
SONAR also has support for free code coverage tools such as JaCoCo. Code coverage can be measured by unit tests or integration tests. You can even drill down to source code level - something I love to do :)
SONAR also integrates with newer tools such as SQALE that take a formal approach to defining code quality in terms of maintainability, testability, reliability, changeability, efficiency, security, portability, reusability, etc. Overall, it is an invaluable tool to assess technical debt.
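For a sense of how little setup a basic analysis needs: a Maven project is typically analysed by running mvn sonar:sonar against a running Sonar server, while non-Maven projects can use the Sonar Runner with a small sonar-project.properties file. The property values below are placeholders, not a real project.

```properties
# Minimal sonar-project.properties for the Sonar Runner
# (the key/value pairs below are illustrative placeholders)
sonar.projectKey=com.acme:sample-app
sonar.projectName=Sample App
sonar.projectVersion=1.0
sonar.sources=src/main/java
```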
Labels:
Code Metrics