It is now the time of the year when people who should know better try to make predictions about what will happen by the end of 2012. For example, many commentators are asking: Will Greece still be in the Eurozone? Will Obama win the election? Will China experience a property meltdown? To name but a few.
So, although I should know better, here are my three predictions for The Decision Model (TDM) in 2012.
The Decision Model Prediction 1
By the end of 2012 a low-cost, entry-level TDM modelling tool will exist that enables modellers to create TDM models and validate that they comply with the 15 TDM principles. Currently only one tool has this capability: the heavy-weight Sapiens DECISION tool, which is designed for enterprise TDM modelling, life-cycle management and governance.
There has been an intense debate on the Drools and jBPM blog, with two posts by Mark Proctor: http://blog.athico.com/2011/11/decision-model-ip-trap.html and http://blog.athico.com/2011/12/decision-model-ip-trap-part-deux.html.
At the same time, another intense debate on the TDM patent was going on in “The Decision Model” LinkedIn group. Mark saw fit to place a copy of the complete discussion thread from that closed LinkedIn group on his public website, against what I feel is the spirit of a closed group of which he is a member.
Personally I respect Mark’s contributions to the open source community, the recent software patent he was awarded (US Patent 7904402) and his significant contributions to the Drools project at Red Hat. And whilst Mark has sought to interpret some of my arguments in the LinkedIn discussions as trying to “belittle” him, this was not my intention.
It is not my intention in this post to provide a point-by-point rebuttal of all Mark’s arguments. What I hope to do is outline the central plank of his argument, namely his position on software patents, and then look at some of the key arguments he has used to support his assertion that The Decision Model IP is a trap.
From a patent perspective, the world can be divided into three groups: those who believe passionately that software patents should be abolished; those who believe that software patents should, or will, continue to exist; and those who do not care either way.
Now Mark believes passionately that software patents should not exist. He also believes that, until the day when software patents are abolished worldwide, the only people who have the moral right to own patents are open source software companies, who, being open source companies, will naturally use their patents defensively.
There is nothing wrong in Mark holding this passionate belief.
The Decision Model (TDM) is one of those disruptive technologies that come along once in a while that has significant implications for business agility, designing business applications, operational decision making, business rules, process models, business requirements, business architecture and regulatory compliance – to name but a few.
Over the last two years large clusters comprising 1,000s of commodity CPUs, running Hadoop MapReduce, have powered the analytical processing of “Big data” involving hundreds of terabytes of information.
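For readers unfamiliar with the programming model these clusters run, here is a toy sketch of MapReduce-style word counting in plain Python (illustrative only; real Hadoop jobs distribute the map and reduce phases across thousands of machines, and the record data below is made up):

```python
from collections import defaultdict

def map_phase(record):
    # Emit (word, 1) pairs, as a Hadoop mapper would for word counting.
    return [(word, 1) for word in record.split()]

def reduce_phase(pairs):
    # Group by key and sum the counts, as the reducers would after the shuffle.
    counts = defaultdict(int)
    for word, n in pairs:
        counts[word] += n
    return dict(counts)

records = ["big data big clusters", "big data"]
pairs = [kv for record in records for kv in map_phase(record)]
totals = reduce_phase(pairs)
```

The appeal of the model is that the map phase is embarrassingly parallel, which is what lets commodity clusters scale to hundreds of terabytes.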
Now a new generation of CUDA GPUs, based on the Fermi architecture (see figure 1) and its successor Kepler (Kepler is due in 2011), has created the potential for a new disruptive technology for “Big data” analytics based on the use of much smaller hybrid CPU-GPU clusters.
The GPU (Graphics Processing Unit) is changing the face of large-scale data mining by significantly speeding up the processing of data mining algorithms. For example, using the K-Means clustering algorithm, the GPU-accelerated version was found to be 200x-400x faster than the popular benchmark program MineBench running on a single-core CPU, and 6x-12x faster than a highly optimised CPU-only version running on an 8-core CPU workstation.
These GPU-accelerated performance results also hold for large data sets. For example, in 2009, on a data set with 1 billion 2-dimensional data points and 1,000 clusters, the GPU-accelerated K-Means algorithm took 26 minutes (using a GTX 280 GPU with 240 cores), whilst the CPU-only version running on a single-core CPU workstation, using MineBench, took close to 6 days (see the research paper “Clustering Billions of Data Points using GPUs” by Ren Wu and Bin Zhang, HP Laboratories). Substantial additional speed-ups would be expected if the tests were conducted today on the latest Fermi GPUs with 480 cores and 1 TFLOPS performance.
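For readers unfamiliar with the algorithm being accelerated, here is a minimal CPU-only sketch of K-Means in Python (a toy on six points, not the billion-point runs above; the deterministic seeding and the sample points are my own illustrative choices):

```python
def kmeans(points, k, iterations=10):
    """Plain K-Means on 2-D points: assign each point to its nearest
    centroid, then move each centroid to the mean of its members."""
    centroids = list(points[:k])  # deterministic seeding for the sketch
    for _ in range(iterations):
        clusters = [[] for _ in range(k)]
        for x, y in points:
            # nearest centroid by squared Euclidean distance
            j = min(range(k),
                    key=lambda i: (x - centroids[i][0]) ** 2
                                + (y - centroids[i][1]) ** 2)
            clusters[j].append((x, y))
        for i, members in enumerate(clusters):
            if members:  # keep the old centroid if a cluster empties
                centroids[i] = (sum(px for px, _ in members) / len(members),
                                sum(py for _, py in members) / len(members))
    return centroids

# Two well-separated 2-D blobs; the centroids converge to the blob means.
pts = [(0.0, 0.0), (0.1, 0.2), (-0.1, 0.1),
       (5.0, 5.0), (5.1, 4.9), (4.9, 5.2)]
centroids = sorted(kmeans(pts, k=2))
```

The inner assignment loop (nearest-centroid search over every point) dominates the runtime and is embarrassingly parallel, which is precisely the step the GPU versions spread across hundreds of cores to get the speed-ups quoted above.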
Over the last two years hundreds of research papers have been published, all confirming the substantial improvement in data mining performance that the GPU delivers.
Directors are always on the lookout for ways to increase their revenues, reduce costs and mitigate their business risks. One of the principal ways of achieving these goals is to create a dynamic and highly responsive business operation using a number of agile technologies.
Directors are often told that implementing a single agile technology will solve the agility problem for their business. However, whilst many businesses have achieved some level of success using a single technology (such as BPM or Predictive Analytics), this single-technology approach is no longer viable in today’s challenging and competitive business environment.
There are currently six technologies that can be considered the “Six Levers of Business Agility”. A detailed white paper on the Six Levers of Business Agility can be downloaded from here.
Implementing these six business agility levers provides a substantial additional uplift in business performance over what can be achieved by a single lever.
The first step to achieving business agility is to “SOA-enable” your business applications. By “SOA-enable” I mean that the capabilities within each of your business applications should be delivered by a set of services in addition to the traditional application user interfaces. These services can be loosely coupled with those of other applications, to compose dynamic and agile business processes.
These dynamic business processes can be integrated with other business processes to create highly agile applications that enables your company to respond to changes in your business environment without requiring expensive and lengthy software development.
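As an illustrative sketch of what “SOA-enabling” a capability means in practice (all names here are hypothetical, and the in-process stub stands in for a real SOAP or REST endpoint):

```python
class CreditCheckService:
    """Hypothetical service contract: composed business processes call
    this interface rather than the application's screens or tables."""

    def check_credit(self, customer_id: str, amount: float) -> dict:
        # In a real deployment this logic would sit behind a Web service;
        # here it is a plain in-process stub with a made-up approval rule.
        approved = amount <= 10_000
        return {"customer_id": customer_id, "approved": approved}

# A business process composed from loosely coupled services can now
# invoke the capability without knowing how the application implements it.
service = CreditCheckService()
result = service.check_credit("C-42", 2_500)
```

The point of the sketch is the seam: because the process depends only on the service contract, the application behind it can be replaced or rehosted without rewriting the process.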
The good news is that if you have purchased enterprise application packages over the last few years, they should have been “SOA-enabled” using Web services technology. If they are not, get your vendor(s) to give you the service interfaces for your applications; this should be free if you have a support contract in place.
There is no doubt that many SME directors are examining the value proposition of deploying their business applications in the cloud to take advantage of the agile, flexible self-service that cloud technologies can provide. So I am always on the lookout for new information on what it takes to deploy secure applications in the cloud.
Just the other day I received a mail-shot from OpSource.net with an attention-grabbing headline: “There is no such thing as a Private Cloud”. So with a raised eyebrow I clicked on the email and found an invitation to a webinar by Phil Wainewright called “There’s no such thing as a Private Cloud”, plus a link to a white paper called “Enterprise Meet Cloud: Mapping a safe passage to enterprise cloud adoption”.
I was impressed with the webinar and white paper, and I would recommend them to anyone looking to use the cloud to deploy mission-critical business applications who is concerned about security issues in the cloud.
Note that I have no connection with the guys at OpSource.net; however, I have been tracking them for a number of years now (since before Cloud became the in-thing). They also offer a “white-label” cloud service for companies that want to offer their clients a cloud-based offering without having to make the multi-million-dollar investment in creating a secure cloud hosting infrastructure.
OpSource is certainly a company to watch.