Monday 19 December 2011

A Christmas Case Definition


How do you define "case"? How many times have I been asked this question in the past year? Forget "Adaptive" or "Dynamic"; people just want to define this thing they are trying to automate. And the more I think about it, the more important the definition becomes, because in the definition you begin to distinguish the key elements that make a case unique, specifically in BPM terms, and therefore what must be defined in order to achieve a robust and flexible technology solution.



A case is a collection of data that must be evaluated against organisational policy or procedure to determine the appropriate outcome. Following this definition, the key elements of a case are:



  • Data = paper or electronic documents, or electronic information entered into a system
  • Policy = business rules and SLAs, plus the appropriate role that is empowered to apply them and make the decision
  • Procedure = the route or process the case must follow
  • Outcome = the decision or result


In summary, what we are looking at is a collection of attributes: Routes, Rules, Roles, Reports and Results.
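To make that definition concrete, here is a minimal sketch of how those elements might hang together in code. The names and the trivial rule are purely illustrative, my own invention rather than any vendor's API:

```python
from dataclasses import dataclass
from typing import Callable, Optional

@dataclass
class Rule:
    """Policy: a business rule plus the role empowered to decide."""
    name: str
    condition: Callable[[dict], bool]   # evaluated against the case data
    empowered_role: str

@dataclass
class Case:
    """Data evaluated against policy and procedure to reach an outcome."""
    data: dict                     # documents and info entered into the system
    rules: list                    # Policy: business rules, SLAs and roles
    route: list                    # Procedure: the steps the case must follow
    outcome: Optional[str] = None  # Result: the decision

    def evaluate(self) -> str:
        """Apply each rule in turn; the first matching rule decides."""
        for rule in self.rules:
            if rule.condition(self.data):
                self.outcome = f"{rule.name} (by {rule.empowered_role})"
                return self.outcome
        self.outcome = "refer for manual decision"
        return self.outcome

# A trivial claims case: small, valid claims are auto-approved.
claim = Case(
    data={"claim_value": 800, "policy_active": True},
    rules=[Rule("auto-approve",
                lambda d: d["policy_active"] and d["claim_value"] < 1000,
                "Claims Handler")],
    route=["register", "validate", "decide", "notify"],
)
print(claim.evaluate())   # -> auto-approve (by Claims Handler)
```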

And that is my final blog about case… at least for 2011. Have a happy Christmas, one and all.

Friday 16 December 2011

BIMIXS is a really cool tool for managing ad hoc processes. I can't see it becoming a market-leading product, but I do think other vendors should bring this type of functionality into their BPM solutions.

Wednesday 7 December 2011



My erstwhile companion (and boss) Bob Scott apprised me of the fact that not only had Kofax purchased Singularity, a Microsoft-focussed BPM solution, but Progress had also bought the Corticon rules engine. A perfect match in my opinion: every BPM tool should have a robust rules engine, not just an OEM add-on or primitive routing-only capabilities. Indeed, BPM market consolidation continues apace, and as Neil Ward-Dutton says in his latest editorial, this aggregation in the market will only continue through 2012.


This marriage of BRMS and BPM started many moons ago: Capgemini implemented a then revolutionary solution for the National Assembly for Wales, using ILOG and FileNet to automate the assessment and payment of rural farm payments.

My prediction for the future is that more and more applications will seek to abstract business logic into separate tools, where it can be accessed, created, updated and amended as necessary in natural (or a close approximation of natural) language, like Oracle's OIPA solution. Rules and process management will become like database management is today: as natural to us as mom's apple pie.
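A toy sketch of what that abstraction looks like in practice: the rules live as data alongside near-natural-language descriptions, outside the process code, so they can be amended without a release. The format below is my own invention, not OIPA's:

```python
# Toy illustration (my own format, not any vendor's): rules held as data
# with near-natural-language descriptions, editable without a code release.
RULES = [
    ("If the claimant is under 25, add a young-driver surcharge",
     lambda c: c["age"] < 25, "add_surcharge"),
    ("If the claim exceeds 10,000, refer to a senior underwriter",
     lambda c: c["claim_value"] > 10_000, "refer_senior"),
]

def evaluate(case: dict) -> list:
    """Return the actions fired by the externalised rules for this case."""
    return [action for _, condition, action in RULES if condition(case)]

print(evaluate({"age": 22, "claim_value": 15_000}))
# -> ['add_surcharge', 'refer_senior']
```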
 

Tuesday 6 December 2011

I stumbled on this LinkedIn discussion on my favourite topic, Case Management: what is it? I felt compelled to respond. I know it's going to get me in trouble, but sometimes you've just got to stand up and be counted. I'm repeating my response here for wider viewing and discussion.

I wholeheartedly agree with John (Pyke): case management has been around for donkey's years. The old workflow tools did a reasonable job (if configured correctly) at case management and provided significant improvements to organisational processes.

There is no doubt that the tools we have today are more sophisticated and allow us to deal with ad hoc events within a process. That does not make case management new; it means we have better tools available to help us manage the same requirement.

I weary of vendors trying to explain how new this is. I constantly have to remind my clients that a case is still just a way for an organisation to deal with an event (insurance claim, complaint, customer onboarding, change of address, etc.) in a consistent manner, to achieve some specified objective. Processes, after all, are graphical representations of policy and procedure.

Often the customer wants to do something that doesn't fit prescribed rules and policies, and yes, dynamic capabilities allow us to capture those actions as part of the process. Olivia Bushe of Singularity states that Dynamic Case Management (DCM) is new because we provide knowledge workers with information, or because DCM solutions cross multiple industry sectors. But I have implemented case management in CRM solutions, in knowledge management systems before that, and in workflow solutions before that.

That we can apply this solution across industry sectors is also a pretty weak argument; I have implemented pre-DCM solutions in every sector, from central and local government to utilities and oil and gas, as well as financial services.

Many vendors say that case management is best suited to unstructured, ad hoc processes. This is nonsense: structured processes like insurance claims have been handled by BPM and workflow tools and CRM applications for decades, and they do a great job of bringing the right information to the right person at the right time. DCM, I admit, adds the capability to deal with the ad hoc more elegantly, but that is it; everything else about DCM is no different to what went before.

We have been applying process to cases (i.e. a collection of information about an event) for many years, using all manner of different solutions. So trying to define adding process to case as a distinction is meaningless.

Pablo Trilles lists a number of must-have requirements for a DCM solution:

 

  • Management by BPM Processes
  • Business Rules Management (not to be confused with Process Rules)
  • Agile management of Documents and Web Content
  • Elements of Information, Communication and Collaboration between employees and with people external to the entity
  • Processes with the ability to deviate the flow at any given time, to other processes (with or without return)
  • Management of mandatory tasks, both planned and unplanned, with Dynamic Forms capable of appearing and being hidden according to the circumstances
  • Agile creation of additional steps for the element control of the cases
  • Tools to observe, control and analyze the execution of each case as a whole, as well as analyze the combined results of terminated cases for continuous process improvement

All but two of these (management of mandatory tasks and agile creation of additional steps) are absolutely no different from a workflow solution of the 1980s. For example, the CAPM project for the National Assembly for Wales automated the process of rural farm payments with a workflow solution and a rules engine, way before PRPC. It was an award-winning solution; ground-breaking, given that it was one of the earliest use cases combining ILOG with FileNet.

In older solutions, dealing with ad hoc requests meant routing a task to a "knowledge worker" who was empowered to make a decision. They might have to consult other co-workers and get agreement on the resolution over email, but it resolved the issue. Today, presumably, that knowledge worker would use the BPM solution to create the steps that route the exception request to a colleague instead of using email, making it part of the process: visible, auditable and reusable if the event occurs again in the future.
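In code terms, the difference is roughly the following sketch (all names hypothetical, not any product's API): the exception step is inserted into the running case rather than handled in email.

```python
from dataclasses import dataclass, field

@dataclass
class Task:
    name: str
    assignee: str
    ad_hoc: bool = False   # True if created at run time by a knowledge worker

@dataclass
class CaseProcess:
    steps: list                                    # remaining route for the case
    audit_log: list = field(default_factory=list)  # visible, auditable history

    def insert_ad_hoc_step(self, after: str, task: Task) -> None:
        """Add an exception step inside the process, instead of in email,
        so it is visible, auditable and reusable if the event recurs."""
        task.ad_hoc = True
        idx = next(i for i, s in enumerate(self.steps) if s.name == after)
        self.steps.insert(idx + 1, task)
        self.audit_log.append(f"ad hoc step '{task.name}' added after '{after}'")

process = CaseProcess(steps=[Task("assess claim", "handler"),
                             Task("notify customer", "handler")])
process.insert_ad_hoc_step("assess claim",
                           Task("consult engineering on brake fault", "engineer"))
print([t.name for t in process.steps])
print(process.audit_log)
```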

For those stating that DCM is a fusion of BPM and ECM, what do you think the imaging and workflow solutions were doing? FileNet, Documentum and a host of others have been fusing workflow with document management for quite some time.

Thursday 1 December 2011

Keep it Simple Stupid



I like Sandy Kemsley; she always gets it just right. BPM projects often turn into long, drawn-out affairs because of scope creep or too many conflicting priorities. In a recent article, Keep it Simple, Sandy gives some practical advice on how to cut complexity out of BPM programs.

In addition to Sandy's sage advice I would add that it is really important to keep those first projects simple in terms of integration points.   

Too often projects turn into nightmares when they go integration-mad on the first delivery, when development teams are still just getting used to the tools and ways of working.

You need a strong project manager or process owner to stand up to the business, keep the scope small and manageable, and break delivery up into leaner, more manageable chunks.

What makes good BPM Governance ...



Alan Earls recently published a good list of what comprises good BPM governance. To this list I would also add:

  • Ensure processes have an empowered process owner
  • Ensure process initiatives deliver on strategic objectives
  • Have a clear set of overarching principles; this will help to prioritise projects
  • Have a governance design authority (a mix of business and technical people) to advise and support process owners and oversee projects
  • Have a consistent methodology for delivering projects
  • Promote reuse of methods, standards, processes, applications and services
I would also ensure that there is a clear enterprise architecture (EA) framework that supports your BPM initiative. Something like IAF or TOGAF can seem heavyweight, but it can save you a lot of time, and you can (and should) implement the EA incrementally, project by project, as you do with your BPM initiative.

Finally, someone is taking a holistic view of collaborative BPM. See Paul Liesenberg's article on Architecting Collaborative Applications. Too many articles have focussed on collaborative or social BPM as being just about creating the business process with some Web 2.0 gimmickry.


In fact, as you quite rightly say, entire processes are about collaboration across people, departments, systems and even organisations. BPM supports, and is an integral part of, a collaborative architecture. I am currently editing a piece on collaborative BPM by one of my colleagues, Marc Kerremans, who talks about this trend.

Sunday 13 November 2011

BPM and Adaptive Case Management: they're one and the same, I'm afraid…


I feel like I need to rant, and I have been feeling this way for quite a while; in fact, ever since Dynamic Case Management (DCM), or Adaptive Case Management (ACM), or whatever three-letter acronym you like to call it, came on the scene.
The reason for my acute irritation is the pious articles proclaiming this new paradigm for resolving a hitherto unknown work pattern. My contention is that DCM or ACM is not new; it sits completely within the definition of case management. I'll concede that it has some specific characteristics that can be distinguished from traditional case management, but in my view these are semantics, reflecting the improved capabilities of BPM solutions to manage an ever broader range of work types that have always existed.
I wasn't going to write this blog because I knew I'd become emotional; however, after reading Bruce Silver's piece on BPM and ACM earlier today, I just started blogging. Bruce starts off by setting the stage: "can a BPMS do a good job with case management, or do you need a special tool?" He then goes into a discussion of BPMN. How the latter relates to end users I'll never know; in the last five years of implementing BPM solutions, none of my customers has ever begged me to use BPMN. Many have asked if it is necessary or a good standard, to which I have replied: standards are always good, but as long as you have a simple modelling notation that the project team agrees on, it really doesn't matter too much.
My real bone of contention, however, is that case management has always been the sweet spot for both workflow and BPM solutions. FileNet, Documentum and Staffware made millions from implementing solutions for insurance companies that allowed them to program quite rigid processes into workflow solutions, providing better control and visibility. BPM tools, with their improved integration and model-driven architectures, allowed simpler, richer and faster process design, providing real-time reporting. These solutions raised the game for process automation, causing the insurance companies who had previously invested in workflow solutions to quickly adopt the new BPM paradigm.
There was never any question, in the minds of customers, that what the new BPM solution provided was improved case management. It's only when the pundits, and no doubt the BPM vendors, came along with new capabilities to sell that we began to muddy the waters with definitions of case management, or more specifically ACM and DCM, which meant that you could now handle so-called unstructured processes, allowing knowledge workers to change their processes on the fly, route work to a new role, or create a new activity not previously defined in the process.
For me, processes have always lived on a spectrum from unstructured email conversations to highly structured insurance claims handling. Good examples of unstructured processes are incidents, like an oil spill, or a call-centre agent receiving calls about a faulty car whose brakes fail inexplicably. There is no prescribed process, and the knowledge workers in these situations must figure out how to resolve the issue, usually by discussion amongst a team of individuals who derive a policy with guidelines and a set of rules. The resulting process emerges through the interaction and collaboration. This process may never be documented, for example where the likelihood of the incident occurring again is so remote that it doesn't warrant being codified in a programmatic solution.
Another example of an unstructured process is where the knowledge worker seeks to resolve an out-of-boundary condition: although it is an insurance claim, the existing rules don't specify what to do for this condition, or the knowledge worker may want to treat a high-spending customer differently to retain their business, but the rules don't allow it. In order to resolve the case, the knowledge worker must use their discretion, often by consulting with managers or, if authorised, using their own initiative, to determine the appropriate course of action; in effect, creating an exception process on the fly. In this latter example, the process may be codified and written into policy such that all customers meeting these criteria can be treated in a similar manner.
Real adaptive case management allows this exception handling to be codified on the fly: rather than resorting to email outside of the process to agree the change, the knowledge worker can create the steps that route the work to the relevant individual, or alternatively create a new rule, or even a data item to record the new decision type, rather than just recording the action in the "notes".
The fact that current BPM solutions allow you to manage these latter scenarios is proof positive that this is just an extension of case: the new steps can be included as an "alternate flow". What I haven't heard from any quarter is that you cannot use a BPMS to handle these ad hoc situations. In fact, most vendors are falling over themselves to explain how elegantly they can handle these new use cases in the latest versions of their solutions.
And this is why I get so angry: because we are talking about processes, cases, that either need to be captured as a one-off or need to give the user the ability to refine the process on the fly. In both scenarios we are still talking about a case. When the knowledge worker must create new activities, policies or rules within a process, this is just an extension of platform capability, completely within the bounds of the latest BPM solutions.
So, just for the record: processes of whatever type, ad hoc, structured or unstructured, have always been with us, and BPM and workflow solutions have always been able to manage them. Traditionally this was only in limited ways, by codifying them once the rules and activities became stable, agreed policies and procedures.
In today's world, BPM solutions can handle more dynamic processes by allowing knowledge workers, where authorised, to create new activities (policy and procedure) on the fly at run time. They are doing this with the same, albeit enhanced, BPM solutions that provide essentially the same core functionality: model-driven development of processes, rules, SLAs and UIs, all without recourse to writing code. Yes, the functionality has been extended somewhat and the complexity hidden away, so that users can create more robust and compliant solutions more easily. But they are still cases, and they are still managed. Right, I feel better now that I've got that off my chest; back to Strictly Come Dancing!

Sunday 30 October 2011

Mastering Data: Where do processes fit in?

"A failure to address service-oriented data redesign at the same time as process redesign is a recipe for disaster." (Michael Blechar)
On recent trips to client sites over the summer, the same problem has arisen regarding enterprise Business Process Management (BPM) programs. I have repeatedly been asked: "How do I achieve one clear version of the truth across multiple applications? How can I avoid complex data constructs in the BPM solution, and do I need to build an enterprise data model to support the enterprise BPM program?" It took me a while to piece the answers together, but a chance conversation with some colleagues about Master Data Management yielded some pretty interesting results. Before I launch into possible solutions, though, let's take a step back and ask, "How did we get here?"
With their adoption of open standards, SOA, SOAP and web services, BPM solutions are excellent vehicles for creating composite applications. Organisations have naturally grasped the opportunity to hide the complexity of application silos and eliminate error-prone, time-consuming re-keying of data. However, they have embarked on these enterprise programs in a piecemeal fashion, one project at a time, often with no clear vision of their target operating model, while also ignoring what Michael Blechar calls siloed data.
Creating a single unified interface that contains all the data needed to complete a process can greatly contribute  to a seamless and streamlined customer experience, meeting both employee and customer expectations. I have seen many examples of these apps in recent years.
However, customers who had embarked on enterprise BPM programs were finding that, far from having simplified and streamlined processes, they were increasingly coming face to face with the problem of data. Data from multiple sources (different systems and differing formats) and sheer volume were not so much the problem as duplication. While they are great at integrating and presenting data, with few exceptions BPM solutions are much less suited to managing data. How so?
In today's heterogeneous application landscapes, organisations have multiple systems that hold customer, product, partner, payment history and a host of other critical data. First-wave BPM implementations typically integrate with one or two systems, and the problem of master data management stays hidden. However, as the second and third waves of process implementations begin, with the need to integrate with more systems carrying the same entities, the problem of synchronising updates to customer addresses or product data becomes acute.
While BPM tools have the infrastructure to hold a data model and integrate with multiple core systems, the process of mastering the data can become complex and, as the program expands across ever more systems, the challenges can become unmanageable. In my view, BPMS solutions, with a few exceptions, are not the right place to be managing core data[i]. At the enterprise level, MDM solutions are far more elegant, designed specifically for this purpose.


So back to those customers I visited mid-summer. I was asked, "How do I manage data across upwards of 30 systems? Should I build an enterprise data model?" It turned out that a previous BPM project had tried to build an object model in the BPM platform; the model was both complex and difficult to manage. Worse still, the model was already unstable and did not yet account for the data structures across the multiple systems in the enterprise.
Intuitively, I felt that trying to produce an enterprise data model was the wrong approach. Having a history of data modelling, I knew that this could take many months to get right in a complex organisation and would ultimately delay the transformation program. And even if the model was actually completed, the likelihood of it being accurate or relevant was pretty low.
This led me to investigate where we had done this type of thing before, and I found that on our most successful account we had encountered the same problem and engineered a technically brilliant but highly complex solution that was, again, difficult to manage. I began discussing the problem with my colleagues Simon Gratton and Chris Coyne from the Business Information Management (BIM) team, who had also encountered the same problem; better yet, they had begun to address it.
As we discussed the issue – they from a data perspective, me from a process perspective – a common approach emerged. The biggest problem with enterprise programs was that they failed to start at the enterprise level, and those initial projects focussed on too small an area, with the understandable and perfectly valid reason of getting that "score on the door". However, when going further, the business architecture or Target Operating Model (TOM) "provides a vision and framework for the change program; it acts as a rudder for the projects, helping the individuals involved make good decisions" (Derek Miers and Alexander Peters). Without this rudder, projects can all too quickly run aground.
This operating model also specifies the strategy and core processes that will support the new operation. These processes imply a set of services (create customer, update address) and a set of data objects that are required. Once defined at this logical level, the technical services and reusable components can also be defined. The core data objects, once identified, should be defined in a common information model that will support the core processes.
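As a rough illustration of the split this implies (entirely my own names, not a prescribed schema): enterprise entities live in the CIM with one agreed definition, while process-specific attributes stay local to the BPM solution, as the note under the steps below also points out.

```python
from dataclasses import dataclass, field

# Enterprise entity: one agreed definition, held in the CIM and reused by
# every process that touches a customer.
@dataclass
class Customer:
    customer_id: str
    name: str
    address: str   # one enterprise-wide definition and validation rule

# Process-specific attributes stay local to the BPM solution: they matter
# only to this one process and are never promoted into the CIM.
@dataclass
class MotorClaimCase:
    customer: Customer                # reference to the CIM entity
    rating_score: float = 0.0         # process-local (e.g. rating engine input)
    fraud_flags: list = field(default_factory=list)
```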
Working across a number of engagements, the approach has been refined into the following steps:
1. Create a TOM with a clear vision and strategy
2. Define the business operations and core business processes to deliver the strategy
3. Select 4 or 5 processes and identify the data entities required to support them
4. Create a common information model (CIM) with agreed enterprise definitions of the data entities
5. Identify reusable services that create, update or amend the entities (these become reusable business services)
6. Build a single process in project 1 and populate the CIM with general attributes*
7. Incrementally develop the CIM over the life of the program, building in additional entities required to support new processes
*Please note: some data attributes will be very specific to a process (e.g. an insurance rating engine) and can be created in the BPM solution only. However, other attributes and entities will be applicable to the enterprise and so should be stored in the CIM.
 Acknowledgements
I would like to express heartfelt thanks to my colleagues and peers who, through lively discussion, debate and violent agreement, helped refine the ideas expressed in this paper.
Chris Coyne
Simon Gratton
Fernand Khousakoun
And external sources
Derek Miers and Alexander Peters
Master Data Management Enables BPM and SOA, Michael Blechar, Gartner, 14 June 2009



[i] Vendors like Cordys and Software AG have MDM solutions fully integrated into their BPM suites.


Thursday 15 September 2011


Making Legacy Agile: Controlled Migration

Controlled migration uses BPM solutions to "wrap" around ageing, inflexible IT assets. The processes within these assets are exposed in a more flexible, agile process layer. The processes can then be adapted to the changing needs of the business without having to "rip and replace" core systems.
The problem with CRM, ERP and older mainframe applications is that they were not designed to be agile. CRM solutions are great data repositories, but they support processes poorly, with users needing to navigate to the right screen to find the piece of information they need. Worse still, making changes to the original "data views" was a difficult, time-consuming and costly exercise.
ERP solutions fared little better, as they were designed with a "best practice" process already installed; if the client wanted to amend or adapt this process, it became a difficult, costly and lengthy exercise. Mainframes fared no better and were never designed to cope with today's fast-changing, process-centric environments. Many applications were product-centric and dictated a rigid, hierarchical, menu-driven approach to accessing data and functions.
Other proprietary workflow or off-the-shelf products were never designed for integration and therefore became isolated islands of functionality; the only way to access them was to re-key information from one platform to the other, with the inevitable mis-keying errors and delays in processing that result.
Modern enterprises are multi-product, customer-centric organisations that require knowledge workers to execute multiple customer-related tasks at the same time. Against a backdrop of ageing IT infrastructure, the challenges are huge.
By using a BPM solution like Pega Systems' PRPC, the process is primary and the application is designed to present the right data to the right person, at the right time. The system guides the user through the process, as opposed to the user having to find the data or remember what to do next. Processes, products and even new applications can be added in weeks or months rather than years.
Finally, controlled migration is about protecting assets: the ability of BPM solutions to integrate quickly with critical systems across the enterprise, to expose hidden processes and to facilitate change and adoption. By exposing functionality in the process layer, organisations are able to extend the useful life of their IT assets, which remain as systems of record or transaction-processing engines. Over time, more processes and business rules can be transferred to the more agile and flexible process layer, while the underlying system can be safely retired or replaced. This "controlled migration" can be implemented with minimal impact to business as usual.
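In code terms, the "wrap" described above is essentially a facade: the process layer calls a clean, stable service interface, behind which the legacy system sits. A minimal sketch, with all names invented for illustration:

```python
# A minimal sketch of the "wrap" (all names invented): the process layer
# calls a clean service interface; the legacy system sits behind it.
class LegacyPolicySystem:
    """Stand-in for an ageing core system (mainframe, old CRM, etc.)."""
    def fetch_screen_data(self, policy_no: str) -> dict:
        return {"POLNO": policy_no, "STATUS": "A", "HOLDER": "J SMITH"}

class PolicyService:
    """The agile process layer: a stable, process-friendly operation.
    Its implementation can later migrate off the legacy system without
    the processes that call it ever noticing."""
    def __init__(self, backend: LegacyPolicySystem):
        self._backend = backend

    def get_policy(self, policy_no: str) -> dict:
        raw = self._backend.fetch_screen_data(policy_no)
        # Translate legacy screen fields into the process vocabulary.
        return {
            "policy_number": raw["POLNO"],
            "active": raw["STATUS"] == "A",
            "policy_holder": raw["HOLDER"].title(),
        }

service = PolicyService(LegacyPolicySystem())
print(service.get_policy("P-1001"))
```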

Monday 5 September 2011

MDM and Business Process Management


The convergence of BPM and MDM is nothing new. Products like Cordys have featured MDM components for some time, while BPM vendors such as Software AG (with their acquisition of Data Foundations) and TIBCO also include MDM functionality as part of their BPM stacks. "Business processes can only be as good as the data on which they are based," according to Forrester analysts Rob Karel and Clay Richardson; in September 2009 they noted that data and process are as inseparable as the brain and the heart. Karel and Richardson went on to emphasise that:
…process improvement initiatives face a vicious cycle of deterioration and decline if master data issues are not addressed from the outset. And MDM initiatives face an uphill battle and certain extinction if they're not connected to cross-cutting business processes that feed and consume master data from different upstream and downstream activities.
Richardson also noted, in his review of Software AG's acquisition of Data Foundations: "The only way master data can reduce risks, improve operational efficiencies, reduce costs, increase revenue, or strategically differentiate an organization is by figuring out how to connect and synchronize that master data into the business processes…" Clay goes on to say: "…With this acquisition, Software AG acknowledges that the customers of its integration and business-process-centric solutions have a strong dependency on high-quality data. This move reflects a trend that we have identified and coined as 'process data management,' which recognizes the clear need for business process management (BPM) and MDM strategies to be much more closely aligned for both to succeed…"

This observation is quite true; more recently, two clients have posed the question of how to manage data across the enterprise when embarking upon a large-scale BPM program. They have realised, as Karel did, that "To Deliver Effective Process Data Management... data and process governance efforts [will need] to be more aligned to deliver real business value from either."

Then there is the often-discussed promise of adaptive processes, or as Richardson remarked, the ability to make "....processes much more dynamic as they're executing……. processes reacting to business events and able to adapt in flight." In order to make this happen, clean, accurate data is critical; otherwise your processes are going to be adapting to unreliable or, worse, inaccurate data. The challenge, according to Richardson, is getting BPM and data teams to work together: research suggests that only 11% of master data management and business process management teams are co-located under the same organization or at least coordinate their activities.

Organisations like PlayCore, however, have already begun to realise the benefits of aligning the two schools of thought. PlayCore is a leading playground equipment and backyard products company whose products are sold under the brand names GameTime, Play & Park Structures, Robertson Industries, Ultra Play, Everlast Climbing, and Swing-N-Slide.

Each of PlayCore's six business units has its own separate general ledger (GL) system, and PlayCore corporate has a seventh. Two of them use JD Edwards; three of them use Intuit QuickBooks, while the other two use Sage MAS 200 and MYOB.
PlayCore's challenges centered around:
  • Consolidating overall performance results from the seven separate GL systems
  • Manual, time-consuming effort to pull detailed information from the different GL systems into the BPM system
  • Centralizing class, department, customer, product, account, and company master data.
MDM enables PlayCore to fully leverage their BPM system for efficient, timely analysis. It gives PlayCore the ability to produce quick, correct comparisons and reports on Class, Department, Customer, Product, Account, Company—without worrying about the underlying GL, but with all of the necessary ties to financial statements, enabling them to drill down where necessary.

Source: Profisee.com



 

Sunday 4 September 2011

BPM and mastering data across the enterprise

BPM solutions typically rely on core systems to supply, or even master, the data they use. In the BPM process, data is retrieved from, displayed and updated in those core systems as part of a business process.

Source: Cordys Master Data Management Whitepaper

The BPM solution must rely on trusted data sources used by business stakeholders to support those processes. Master Data Management (MDM) is the process of aggregating data from multiple sources, then transforming and merging that data based on business rules. According to Cordys (a BPM and MDM supplier), an MDM solution must have the ability to:

•       Find trusted, authoritative information sources (master data stores)
•       Know the underlying location, structure, context, quality and use of data assets
•       Determine how to reconcile differences in meaning (semantic transformation)
•       Understand how to ensure the appropriate levels of quality of data elements

This functionality can, and in my opinion should, be provided by a solution dedicated to Master Data Management.


BPM solutions provide the means to integrate processes across siloed business applications and departments. This also means mediating, or in some cases creating, a data model and attribute definitions between multiple systems and departments. Typically, data models in different systems do not share the same definitions or the same validation rules; as more and more systems come online during a large transformation program, managing the data model becomes more complex and an ever-increasing overhead for business users.

If not managed, maintained and updated, the data model will no longer support current requirements and will be unable to grow to incorporate future systems as new processes come online. This can result in the need to re-engineer the data model and the BPM applications that rely on it. Data migration of in-flight work items is always difficult and time-consuming, causing increased disruption to the business and inevitable dips in customer service.

“The challenge for a BPM program is how to manage a consistent view of data across the enterprise applications that is now displayed through a single BPM solution”

Source: Kalido Master Data Management Technical Overview

Data for a BPM solution is often mastered in one or more source systems: ERP, CRM, SCM or homegrown applications. In some cases it may be the BPM solution itself, and a separate data model is created for the BPM solution. The BPM solution typically retrieves, displays and modifies data from source systems through the business process; the data must be updated in those source systems while maintaining data integrity and data validation rules.

Data definition, integrity and validation rules are often different for each system, and as a BPM solution grows across the enterprise, encompassing ever more systems, manually attempting to manage a data model across multiple systems and components becomes physically challenging.

The Capgemini MDM solution comprises tools that automate this process and provide model-driven approaches to managing data, including: automated sub-processes to perform data integration and extraction; transformations; data quality routines to cleanse, standardise and parse data, find duplicates and manage potential matching candidates; and validations and checks.
 “MDM provides a model driven architecture for creating and maintaining a model across multiple systems”



Source: Kalido Master Data Management Technical Overview
The model-driven approach enables the business user to define a semantic data model for the enterprise, e.g. Customer, Product, Order, Payment, Service Request, etc. Data management tools can interrogate and align the semantic model across multiple sources, maintaining relationships and data integrity.

The semantic model is maintained in a separate environment that is dedicated to the management of data. Although the business can author and update the model, the complex data integrity and validation alignment across systems is managed automatically by the solution, ensuring a much greater level of accuracy in data validation with significantly less effort, and a faster time to market.
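To give a flavour of what aligning source systems to a semantic model involves, here is a toy sketch (not Kalido's or Capgemini's actual tooling): field mappings translate each system's records into the semantic model's terms, and a simple survivorship rule decides which source wins when values conflict.

```python
# Toy sketch (not any vendor's actual tooling): map each source system's
# fields onto the semantic model, then merge with a survivorship rule to
# produce one trusted "Customer" record.
FIELD_MAPPINGS = {
    "CRM": {"cust_name": "name", "addr_line": "address"},
    "ERP": {"CUSTOMER_NM": "name", "ADDRESS_1": "address"},
}
TRUST_ORDER = ["CRM", "ERP"]   # which system wins when values conflict

def to_semantic(system: str, record: dict) -> dict:
    """Translate one source record into the semantic model's terms."""
    mapping = FIELD_MAPPINGS[system]
    return {mapping[k]: v for k, v in record.items() if k in mapping}

def merge(records_by_system: dict) -> dict:
    """Survivorship: each attribute is taken from the most trusted source."""
    golden = {}
    for system in reversed(TRUST_ORDER):   # least trusted first...
        golden.update(to_semantic(system, records_by_system[system]))
    return golden                          # ...so the most trusted overwrites

print(merge({"CRM": {"cust_name": "Jane Doe", "addr_line": "1 High St"},
             "ERP": {"CUSTOMER_NM": "J. Doe", "ADDRESS_1": "1 High Street"}}))
# -> {'name': 'Jane Doe', 'address': '1 High St'}
```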


Monday 30 May 2011

BPM, Cloud and Crowdsourcing

In 2011 the world of Business Process Management went mainstream. There have been two Gartner events on the subject, not to mention showcases by leading vendors such as Pega Systems, IBM and Oracle, to name but a few. In addition, discussions in BPM forums are all abuzz with the latest trends as the movement embraces other disparate technology trends to form ever more innovative business solutions, some of the most blogged-about being mobile and cloud computing. O'Reilly Radar's Jonathan Reichenthal, Ph.D. writes that "it's clear that mobile is the new global frontier for computing". I believe we are also seeing a clear trend emerging from cloud and SaaS to BPO, and the eventual adoption of even more unusual models such as crowdsourcing.
BPM and Gartner analysts agree that the world is now "appified": with the explosion of smartphones and iPads, we are linked into critical processes 24/7. I find myself responding to emails from my bed at two in the morning because it's so easy, convenient and fun.
My wife and I now compete to find the coolest app that we can casually introduce over dinner. The current favourite is the bus finder, which tracks the actual location of selected buses en route; you can now time the last pint at the bar to coincide with the No. 48 rolling up to the stop across the road from the Fox and Goose.
New capabilities such as geolocation, sensors, near-field communications, cameras, voice and touch can be integrated into applications, and there is now a proliferation of operating systems in the mobile arena. "Five billion people now use cellphones — about 62 percent of the planet's population — compared to less than two billion who have a personal computer. Within just a few years more people will access the Internet from a mobile device than from any other technology." (O'Reilly Radar)
The use cases for mobile BPM are diverse and exciting, ranging from healthcare to insurance. Doctors, for example, can now "…show a patient a close-up of how the circulatory or respiratory systems work…": Blausen Medical's Human Atlas provides full-color, 3-D animations and illustrations of various parts of the human body. There is also WellDoc's diabetes manager, which allows users to download the app and manage their own care.
In insurance, John Hancock is one of many insurance and annuities providers who now provide salesmen with iPads, taking "advantage of the lightweight, full-color device. A Boston-based life and annuities insurer released an app, i-Illustrate, that allows producers to adjust illustrations of their life insurance products in real time while they're presenting options to consumers…" (Insurance Technology, http://insurancetech.com/business-intelligence/229402882)
The issue of the future won't be one of bandwidth, if the providers have their way: the likes of Fujitsu are to build a wholesale fibre broadband network in the UK, with the aim of offering next-generation services to five million rural homes. The company "…will use Cisco's infrastructure kit in its bid to create the UK's second-biggest fibre network. Virgin Media and TalkTalk are already lined up to resell Fujitsu's services to consumers." (David Meyer, 6 April 2011)
Aligned to this extension of business processes to the consumer via mobile devices is the mounting of those same processes in the cloud. From 2008 to 2010 there was much talk and hesitancy around cloud-based (SaaS or PaaS) services; in 2011, service providers are embracing the technology and providing innovative services. Frank Gens, chief analyst at global IT research firm IDC and author of the December 2010 report "Predictions: Welcome to the New Mainstream," states: "In 2011, we expect to see transformative technologies make the critical transition from early adopter status to early mainstream adoption of this next dominant platform, characterized by mobility, cloud-based application and service delivery, and value-generating overlays of social business and pervasive analytics."
The 2010 Gartner survey of BPM PaaS and cloud-enabled platform vendors confirmed this trend and revealed that a vendor that offers BPM cloud-enabled platform products is also likely to offer BPM PaaS. In fact, only 8 percent of vendors offered their BPM platforms solely as a product, while 59 percent of vendors surveyed offered the same capability both as a service and as a platform product.
Throughout 2009 to 2011 the insurance industry has been busy, according to Insurance Technology, replacing ageing claims-processing systems with modern BPM solutions. While not a BPM solution, "Intellect SEEC is a pure-play SOA-based cloud offering comprising 10 platforms and more than 70 products. The Polaris proprietary Insurance knowledge shelving and wiring framework (L0) has more than 100 business processes, and more than 1,000 business cases documented in the areas of product management, asset/broker management and customer management, spanning new business, policy administration, claims, billing and accounting, risk management, investment and reinsurance…" (http://www.insurance-business-review.com/news/polaris)
This technology ups the ante for BPO organisations and potentially changes the game in the insurance industry. As more and more core services are offered in the cloud, insurance vendors can "stick to the knitting" of product design and marketing, leaving the administration and processing to cloud-based IT service providers. Other organisations, such as DST Systems, who have provided multi-tenant BPO solutions to the insurance industry for the past 40 years, are now providing SaaS-based multi-tenant record keeping as a service.
As the entire value chain of insurance provision moves to the cloud, insurers may also want to avail themselves of other low-cost services, and crowdsourcing offers one such option. Jeff Howe is usually credited as one of the first journalists to coin the term crowdsourcing: "…the act of taking a job traditionally performed by a designated agent (usually an employee) and outsourcing it to an undefined, generally large group of people in the form of an open call…"
There is now a vast array of websites offering a whole range of services, from banking to fashion design, and from ideas banks to mutual funds management. For example, Grameen Bank provides loans to the poor in rural Bangladesh, based solely on mutual trust, without the need for collateral. Another example is ZOPA loans, where individuals give loans to other individuals, as they say on their website, "cutting out the banks". A fashion design company allows customers to determine which of its designs are put on sale, and the "Huduma" project in Kenya uses crowdsourcing to monitor the effectiveness of services such as health and education provision across the country.

From these examples it is easy to see that companies will soon begin to embrace crowdsourcing as a way to solve their own problems. Eli Lilly, for example, uses InnoCentive, a brainpower initiative launched in 2001 to "provide a market for people to help develop drugs and speed them to market". InnoCentive is now made available to a network of subscribers including Boeing, DuPont and Procter & Gamble, who post their thorny problems on the site to be solved by eager brains with intellect to spare.

Monday 2 May 2011

Recollections from Gartner BPM Conference London 2011

I recently attended the Gartner BPM Symposium in London, and from my observations four key themes emerged: social CRM, topical in the era of the social network; mobile; analytics, or what I prefer to call predictive processing; and, since no IT conference would be complete without some discussion of cloud computing, the theme "BPM in the Cloud" was duly covered.

Today we will look at predictive analytics; the remaining themes will be covered in later instalments.
Daryl Plummer, Gartner analyst, predicted that analytics would take centre stage in the next wave of BPM implementations. We are already seeing this in CRM applications, with complex cross-sell and up-sell decision trees for customers: Chordiant, now part of Pega Systems, provides one of the industry's leading CRM decision engines. Meanwhile, my colleagues and I have been defining an architecture that incorporates analytics into the continuous improvement cycle. We found that although many of our customers had invested in both BPM and analytics solutions, they had not integrated them in a holistic way. All too frequently, raw process data was being fed into analytics engines to be turned into dashboards or, worse still, static reports.
Managers are already churning through a deluge of data in the form of email alerts, reports, dashboards and now tweets and blogs. What we want to do is make processes self-aware, such that they change, or at least suggest changes, based on real-time information. There are endless scenarios where this could be useful: an insurance company receiving a spike in calls regarding accidents in a certain make of car, for example, might want to raise the premium on those models based on the increased risk. It might also want to send a letter to the manufacturer asking them to investigate the pattern. Having the process suggest or even implement such changes itself would save managers hundreds of hours, rather than waiting for human intervention and risking loss of revenue, customers or damage to reputation.
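As a rough sketch of that "self-aware" idea (the window size, threshold and suggestion format are all illustrative, not any product's behaviour): watch a sliding window of incoming accident calls and raise a suggested rule change when one model spikes.

```python
from collections import Counter, deque
from typing import Optional

WINDOW = 100            # number of recent calls to consider (illustrative)
SPIKE_THRESHOLD = 0.20  # flag a model exceeding 20% of the window

recent_calls = deque(maxlen=WINDOW)

def record_call(car_model: str) -> Optional[str]:
    """Feed each incoming accident call in; suggest a change on a spike."""
    recent_calls.append(car_model)
    if len(recent_calls) < WINDOW:
        return None                       # not enough data yet
    model, hits = Counter(recent_calls).most_common(1)[0]
    if hits / WINDOW > SPIKE_THRESHOLD:
        return (f"SUGGESTION: review premium rule for '{model}' "
                f"({hits}/{WINDOW} recent accident calls) and open a "
                f"manufacturer investigation task")
    return None

suggestion = None
for call in ["Corsa", "Focus", "Golf", "Astra", "Clio"] * 15 + ["ModelX"] * 25:
    suggestion = record_call(call) or suggestion
print(suggestion)   # ModelX at 25% of the window trips the threshold
```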
With the increase in the frequency and volume of data, humans are no longer able to digest, absorb and make decisions quickly enough. More importantly, some patterns of behaviour are invisible to the naked eye. Today's sophisticated analytics engines are able to read, absorb and interpret terabytes of data, and to detect patterns. These patterns can be translated into rules, which we can then incorporate into our processes. We call this predictive analytics, and targeted solutions are coming soon.