(part 2 of a stream of thoughts on what business decision architectures are required for enterprise decision management, how to align it with business goals, and how to provide the right information to support the decision architecture. This is all work in progress and there will most definitely be contradictions between posts as my thinking develops and I get to test it properly. Comments and conversation are welcome!)

When reorganising a business to align its activities with strategic goals, a commonly seen and successful approach is to agree a target operating model (TOM) which defines the future shape of the business, from markets and channels through information and people to infrastructure and locations. All too often, the critical decisions associated with executing the many layers of the strategy are left out of the process layer, left undefined or defined inconsistently.

A target operating model is, above everything else, a communication and agreement mechanism. Without including decisions and their analytic requirements in the future design of the business, process improvement through decision management is going to be a false start. A TOM is the chance to get decisioning in front of senior management in a big way. If we want to optimise operational and tactical decision making in the future, we need to build it into the target operating model from the start. A target operating model will typically show the desired layers of the future business in line with a specific business strategy, e.g. growth/increased market share, operational efficiency, cost reduction etc.

  • The benefit of the operating model is that it is discussed in business language, not IT language, and allows senior stakeholders to agree the direction of the business and what is required behind the scenes to support the strategic goals.
  • The model will include a variety of 'layers', including the future markets (including customer segments), the products to be sold to those customer segments, the processes to support this, the information required for these processes, people, infrastructure, facilities, locations etc.
  • The creation and agreement of a target operating model is the right time to think big about how you want the business to change. If driving operational efficiency or cost savings is part of your goal, then you will want to think about increasing efficiency and reducing cost through better decision making at all levels of the business. If growth is your goal then you will want to look at how your decision making can increase revenues and profitability.

The main affected layers of the operating model are:

  • Processes – high level descriptions of each business process (e.g., loan sales) need to be assessed for the high level decisions and made explicit in…
  • A new layer: Decisions – definitions of the key operational and tactical decisions made at each point (e.g., Assign a sales advisor, Pick the right product, Approve/reject a loan) based on…
  • Information – the information required to support each decision (e.g., customer history, sales person history and training, product details, proposed loan value) and whether analytic models are required. Information will also be needed for purposes other than decision making, for example corporate data (i.e., financial, AP/AR, system-of-record data) and process data such as customer contact details or manufacturing process management data. (How much 'management information' is actually required if it is not for decision making? Another one to discuss.)

The decision layer here would essentially be a 'level 0' or 'conceptual' level of your business decision architecture, but it should include the key decisions that you will want to make in the future business. The next step is typically to drill into the architecture to flesh out the details and the to-be target maps.
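To make this a bit more concrete, here is a rough sketch (in Python, purely illustrative – the structure and example values are mine, not a standard) of what a level 0 decision entry might record and how it links to the process and information layers:

```python
# A minimal, hypothetical sketch of a 'level 0' decision entry in the TOM.
# Structure and example values are illustrative only, not a prescribed standard.
from dataclasses import dataclass
from typing import List


@dataclass
class Level0Decision:
    name: str                      # the decision, stated in business language
    process: str                   # the business process it sits within
    information_needed: List[str]  # what the information layer must supply
    analytic_model_required: bool  # does it need a predictive/analytic model?


decision_layer = [
    Level0Decision(
        name="Approve/reject a loan",
        process="Loan sales",
        information_needed=["customer history", "proposed loan value", "product details"],
        analytic_model_required=True,
    ),
    Level0Decision(
        name="Assign a sales advisor",
        process="Loan sales",
        information_needed=["sales person history and training", "customer segment"],
        analytic_model_required=False,
    ),
]
```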

There will be further impact on layers such as technology (are investments needed?), people (who are the experts, who will need to be trained?) and location (could making better decision-advisory data available allow outsourcing?), but these are questions that need to be answered anyway and will be affected more at the detailed level.

So, things I need to follow up on:

  • What does the decision architecture look like? (follow up from previous post)
  • How does it combine with process and information architectures (the decision catalogue and the key classifiers, the key links between layers etc.) and other TOM layers?
  • What is the impact of decision-orientation on the information architecture? How do we need to re-classify information needs?

Comments welcome!


Decisions are made at every level of the business, both within standard business processes and on an ad-hoc basis. Each option available in a decision carries an inherent risk and benefit to the organisation's value, objectives and results. The risk or benefit is the outcome of the choice made, depending on whether the decision was right or wrong.

However, very rarely does anyone actually understand all the decisions being made in an organisation. Decisions are often left out of architectures and form a minor part of process maps. It follows that the aggregate risk/benefit of getting each decision right or wrong is often unknown at a management level. So how can we improve the quality of our decision-making processes to improve the aggregate benefit to the organisation?

We need to know the impact of decisions already made in order to predict whether each choice will be the right one. We need to know what decisions are being made, how often they are made and whether they are part of a structured process or ad-hoc. For each decision, we must know the choices available and the likely risk and benefit of the choice, in the case of the decision being ‘right’ or ‘wrong’. We need to provide appropriate and accurate information to the player making the decision and track the results of the decision based on the possible choices and the information. We need to know the assumptions used as the basis for the decision so that we can manage the feedback loop and improve the risk/benefit of the choices next time.
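To illustrate the arithmetic behind that (a sketch only – the probabilities and values below are invented), the expected impact of a choice can be estimated from the chance of it turning out right, the benefit if it does and the cost if it doesn't, then scaled by how often the decision is made:

```python
# Hypothetical sketch: expected impact of one choice, and of the decision repeated
# at volume. All figures are invented for illustration.

def expected_impact(p_right: float, benefit_if_right: float, cost_if_wrong: float) -> float:
    """Expected value of a single choice, given the chance it turns out to be right."""
    return p_right * benefit_if_right - (1 - p_right) * cost_if_wrong


# e.g. approving a loan: 92% chance it performs, worth 400 if it does, costs 5000 if it defaults
per_decision = expected_impact(p_right=0.92, benefit_if_right=400, cost_if_wrong=5000)

# the aggregate benefit/risk scales with frequency (say 10,000 such decisions a year)
annual = per_decision * 10_000

print(per_decision, annual)  # roughly -32 per decision and -320,000 a year: worth improving
```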

The old 'decision architecture' that provided the basis for 'decision support systems' focused mainly on strategic, executive-level decisions. An enterprise decision management approach would segment all enterprise decisions as required, one set of segmentation criteria being the level of automation possible and another being the value at risk/benefit of getting it right or wrong; behind both sits the level of judgement required. The new decision architecture will address, within an enterprise architecture approach, the elements discussed in the paragraph above (a sketch of how they might fit together in a single catalogue record follows the list):

  • Decision catalogue
  • Fit with (or outside of) business processes
  • Repeatability of decisions
  • Typical choices for each decision
  • Implicitness or explicitness of the decision
  • Likely impact (risk/benefit) of making each choice
  • Aggregation into decision yield
  • Information requirements
  • Inherent assumptions (and whether these are decisions themselves)
  • Feedback loops
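As a sketch of how these elements might hang together in one catalogue record (extending the level 0 entry sketched earlier; the field names are mine and purely illustrative):

```python
# Hypothetical sketch of a fuller decision-architecture record covering the
# elements above. Field names are illustrative, not a proposed standard.
from dataclasses import dataclass
from typing import List, Optional


@dataclass
class Choice:
    name: str
    benefit_if_right: float
    cost_if_wrong: float


@dataclass
class DecisionRecord:
    name: str
    in_process: Optional[str]        # None for ad-hoc decisions made outside a process
    repeatable: bool                 # repeated at volume, or effectively one-off?
    explicit: bool                   # is the decision made explicitly today, or implicitly?
    automation_potential: str        # e.g. "full", "advisory", "manual judgement"
    choices: List[Choice]            # typical choices and their likely impact
    information_required: List[str]
    assumptions: List[str]           # inherent assumptions (some may be decisions themselves)
    feedback_measure: Optional[str]  # how outcomes are tracked to close the feedback loop
```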

In my next posts I will discuss (and evolve) these thoughts in more depth.


I haven't blogged for a while but thought I would after reading Henrik Liliendahl Sørensen's post on 55 reasons for data quality. Henrik focused on customer master data; I've focused on key areas in financial and MI data – charts of accounts and organisational hierarchies. So it's probably of more interest to your CFO or financial controllers than to the operational people.

  1. When you have incompatible or different charts of accounts across the group there may be inefficient accounting treatment of transactions. 
  2. When the management and legal views of the company are inconsistent, the additional reconciliations will increase the time taken to close the books.
  3. Incomplete or out of date maps between central accounts and local core platform transaction types may lead to incorrect regulatory, tax or accounting treatment of transactions (a minimal check for these mapping gaps is sketched after this list).
  4. If validation rules are not consistent between countries and business units in consolidation then there may be duplicate or missing summary information supplied for management information.
  5. When you have incompatible or different charts of accounts across the group there may be inefficient tax treatment of transactions. 
  6. When you have incompatible or different charts of accounts across the group then the close cycles may be long compared to your competitors, leading to out of date management information for decision making and long reaction times.
  7. When the management and legal views of the company are inconsistent planning decisions and investments made at various levels may be counterproductive to overall strategy because there isn’t a clear view of the overall strategic numbers. 
  8. When accounts are coded incorrectly in ERPs then there may be transactions booked to incorrect accounts, leading to incorrect tax, regulatory or accounting treatment.
  9. Incomplete or out of date maps between central accounts and local core platform transactions could lead to increased cost of close including reconciliations and review. 
  10. The increased effort to reconcile different charts of accounts across the group will lead to increased audit effort and higher audit costs.
  11. When you have incompatible or different charts of accounts across the group then the close cycles may be long, leading to planning phases being shortened or incompatible.
  12. When the management and legal views of the company are inconsistent internal customers may not be treated correctly which may have an effect on transfer pricing.
  13. When the organisation structure is inconsistently implemented in line of business systems and ERPs, monthly and quarterly closes can be extended due to the number of reconciliations and reviews.
  14. Incomplete or out of date maps between central accounts and local core platform transaction types may lead to regulatory or tax investigation and hence increased costs.
  15. When you have incompatible or different charts of accounts across the group, visibility at group level of available working capital and/or regulatory capital could be decreased, leading to reduced return on capital.
  16. When the management and legal views of the company are inconsistent the finance function may not be able to deliver a strategic service to the business units of the organisation.
  17. When accounts are coded incorrectly in ERPs then there may be transactions booked to incorrect accounts leading to increased audit and tax compliance advisory costs.
  18. When the organisation structure is inconsistently implemented in line of business systems and ERPs, there may be an effect on investment and planning decisions that rely on correct information.
  19. Incomplete or out of date business rules on inter-company transactions could lead to increased tax charges or investigations into transfer pricing.
  20. Incomplete or out of date maps between central accounts and local core platform transaction types may lead to duplicate or missing information for decision making and planning.
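Several of these (3, 9, 14 and 20) come down to the completeness and freshness of the map between local core platform transaction types and the central chart of accounts. Here is a minimal sketch of the kind of check that catches those gaps early – the codes, table and field names are invented for illustration:

```python
# Hypothetical sketch: flag local transaction types with missing, stale or invalid
# mappings to the central chart of accounts. Codes and dates are invented.
from datetime import date

central_accounts = {"1000", "1100", "2000", "4000"}      # central chart of accounts codes
local_to_central = {                                     # local txn type -> (central account, last reviewed)
    "LOAN_DISB": ("1100", date(2008, 1, 15)),
    "LOAN_FEE": ("4000", date(2006, 11, 2)),
    "FX_SPOT": ("9999", date(2008, 1, 15)),              # maps to an account that no longer exists
}
local_txn_types = {"LOAN_DISB", "LOAN_FEE", "FX_SPOT", "LOAN_WRITEOFF"}

today = date(2008, 2, 20)
unmapped = local_txn_types - local_to_central.keys()
stale = {t for t, (_, reviewed) in local_to_central.items() if (today - reviewed).days > 365}
invalid = {t for t, (acct, _) in local_to_central.items() if acct not in central_accounts}

print("No mapping:", unmapped)              # {'LOAN_WRITEOFF'}
print("Not reviewed in a year:", stale)     # {'LOAN_FEE'}
print("Maps to unknown account:", invalid)  # {'FX_SPOT'}
```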

Comments welcome!


Question: How has your business utilized Business Intelligence (BI) and how do you see this developing in the future? Do you see this is a separate departmental function or a core attribute for senior managers and executives?

My answer:
I've seen a range of clients with different approaches to business intelligence, at various levels of maturity across different dimensions, including data quality, integration, automation, business buy-in, and the balance of strategic vs tactical vs operational use.

Most of these have been successful at some level in providing value through information delivery to support decisions. The really successful ones see that the exploitation of, and insight into, information comes from the top down, so that top management look at all the numbers (not just the typical financial ones).

With that senior management sponsorship, the delivery of effective business intelligence then needs to cover their scope of influence – i.e. be consistent across the company.

So to answer your question directly, a data-driven decision making mindset at the top level is needed for effective utilisation across the company (these top people don't need to be 'power users' but they will have the power to make things happen). Then the rest of the organisation will follow their lead.

The leading state of delivery of business intelligence (i.e. the BI department) is a cross-organisation ‘competency centre’, ‘centre of excellence’ or similar where there is a core of skilled, experienced business intelligence architects and developers, who work closely with department-level business people and technologists to deliver optimised BI solutions to the business, using the most relevant areas of the organisation’s knowledge.

Business Intelligence is evolving towards integration with search technology in one direction (give me the answer I need based on these search terms), integration with business rules, predictive analytics and business process management to give 'decision management' in another, and more integrated reporting and planning (which has been evolving for years) in another.

These aren't as different as they seem, and all the threads have one thing in common – a real need for a quality information management approach across the organisation (source systems, master data, data warehouses), which should not be underestimated.


Speaking yesterday with a colleague, we got around to discussing 'information as a service'. My colleague was a vendor analyst before joining us, so he keeps a close eye on what the vendors are saying. His view was that the vendors present information as a service as being all about holding data in memory, speed and very physical features and benefits like that.

Now this may be the case, but it seems like the typical techie approach – making the concept and the supporting/enabling technology into the same thing – when they're not.

Information as a Service (in my world) is the concept, approach and architecture that brings consistent information to other systems in the organisation through a one to many ‘service’ interface. So for example there is a customer service which all other systems should use to get their customer information. This could be real-time but doesn’t have to be. There could be a variety of levels of ‘service’ within the same service (e.g. ID and name may be enough for one system, another may also need billing address, invoice address and preferred shipping info).
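To pin that down, here is a minimal sketch of such a customer service (Python; the interface and field names are hypothetical, not any particular vendor's API) offering two levels of detail from the same trusted source:

```python
# Hypothetical sketch of a one-to-many customer information service with tiered views.
# Interface and field names are illustrative only.
from dataclasses import dataclass
from typing import Optional


@dataclass
class CustomerBasic:
    customer_id: str
    name: str


@dataclass
class CustomerBilling(CustomerBasic):
    billing_address: str
    invoice_address: str
    preferred_shipping: str


class CustomerService:
    """Every other system gets its customer information through this interface."""

    def __init__(self, master_store: dict):
        self._store = master_store  # stands in for the trusted master data source

    def get_basic(self, customer_id: str) -> Optional[CustomerBasic]:
        rec = self._store.get(customer_id)
        return CustomerBasic(customer_id, rec["name"]) if rec else None

    def get_billing(self, customer_id: str) -> Optional[CustomerBilling]:
        rec = self._store.get(customer_id)
        if rec is None:
            return None
        return CustomerBilling(customer_id, rec["name"], rec["billing_address"],
                               rec["invoice_address"], rec["preferred_shipping"])
```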

It may be delivered using a web service architecture, it may be delivered using ETL or EAI tools, or it may be delivered using special 'information as a service' tools. The tools aren't really important. What matters is that the information – whether master data, analytics or other information – is accurate, consistent and trusted at source, and that it can be delivered to the systems that need it when they need it.

So information as a service is about getting the master data right, or the analytics right, and the processes around them, and then using the software as an enabler, rather than the software being the be-all and end-all.


Doug Henschen at Intelligent Enterprise blogged about a speech at TDWI by Tracy Austin, former CIO of Mandalay Resort Group and former VP of IT at Harrah's Entertainment, one of the most celebrated BI-driven enterprises in the world.

1. BI must be business driven, tied to measurable business goals. If BI is currently IT driven, find a way to evolve it into a business-driven initiative.
2. Data management has to be in place. If data quality isn’t there, no amount of cleaning will make it work. Just say no to more BI work until you can put proper data management and data quality in place.
3. Ensure the right mix of business and IT people. “Mandalay had competent people who were used to working in silos, but you can’t operate on your own when it comes to the data warehouse. You need architects and data modelers and people who can bring everything together and tie it to the larger strategy. Even if tools are there and the data is good but the people aren’t in place, it’s going to fail.”
4. Focus on quick measurable wins, not big, monolithic projects.
5. Create a formalized marketing plan. You have to sell the BI program to top executives and the entire organization. “You can even sell it to Wall Street, as we consciously did at Harrahs.”
6. Base BI investment decisions on business value. Ditch the smoke and mirrors or black-box approach. Have the rigor to commit to quantitative and qualitative deliverables and follow up with reports on progress toward goals.
7. Institute joint business and IT planning. Gather key business and IT leaders once a month so you get into a proactive mode. Let the business people tell you how they’re using BI and how they would like to be using it so you can plan the next releases.
8. Foster a business-savvy IT department. The more you can familiarize your IT people with the business drivers and business problems the better.
9. Develop the right BI architecture to meet the goals. Some organizations build as they go and end up with underpowered infrastructure. Some build everything at once in a big-bang project and end up with overblown architecture. The best approach is to plan and architect in advance and then build as you go, spending one step at a time.
10. Optimize the human and information resources. Institute continuous measurement and continuous improvement. BI is not something you put in place and forget. You have to go back and reexamine the fundamental assumptions and success of existing projects.

These points are great. What is noticeable about them is that, as well as 'BI', they generally apply to what I'd see as good practice in general – whether for BI projects, EDM projects, data management projects, business rules, enterprise software or knowledge management projects.

In fact, rather than just talking about BI, I think Tracy is doing a great job of selling what I would read as agile (small a) development methods from a business focus. This so, so needs to be done, as agile has been the domain of the IT crowd for far too long. Technology and the business are like yin and yang (or at least they should be), and agile business and agile technology sit so well together in allowing you to outperform your competitors.

Another thing that's clear here is that 7 of the 10 points are about culture, marketing, strategy and people – the hard 'soft' stuff. Only 3 are about tech. I think Tracy's approach to being a CIO is spot on here. Great piece.


Vinnie Mirchandani blogged today about how GE are using what is essentially BAM (business activity monitoring) on their leased turbines to predict downtime and then share cost savings with their customers. The original came from Booz Allen.

http://dealarchitect.typepad.com/deal_architect/2008/02/ge-siemens-and.html

This is such a great application of operational intelligence. Vinnie mentions how it could be done for outsourcers – coming up with value pricing options for capex investments. Someone will do it first. Wipro or Tata?

It got me thinking about which other industries this could apply to –

  • Premium car manufacturers (my German car is totally controlled by computer) could use the usage data they collect at services to offer more relevant warranties, discounts on services, upgrades?
  • The capex model/value pricing would work on trains, buses and other public transport systems. The public firms could push this further.
  • Banks already use predictive risk models to price credit & loans for their clients (then go and stick their money on flaky investments themselves)

Name change

20Feb08

I've just changed the name to 'information and decisions' to more accurately reflect the topics whose management I'm interested in. Nothing else has changed, however!


http://marketingroi.wordpress.com/2008/02/19/debunking-marketing-myths-single-view-of-a-customer/trackback/

Ron Shevlin, on his 'Marketing Whims' blog, blogged about the single view of a customer being a 'marketing myth'.

Quoting Ron: 'My take: Truth is, most companies don't know what a single view of the customer is, and many place way too much value in the concept.'

Looking at it from a pure marketing point of view, I’d say ‘maybe’. I think Ron focuses in his article a bit too much on the data rather than the applications of that data.

It would be foolish to give a CSR access to every data element available, as Ron suggests, but having all that information ready to provide the CSR with the appropriate action to take is a much better way of doing it. (It still relies on having something near to a single view of the customer; at the very least it requires a single record for each customer, even if the integration of all systems is not included.) If there is a way of measuring the results of the decision made then even better – this can be fed back into the analytic model that provided the action to take in the first place.
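A rough sketch of that loop (the rules and field names below are invented stand-ins for a real analytic model):

```python
# Hypothetical sketch: recommend an action to the CSR from the single customer
# view, then record the outcome so the model/rules can be improved later.
outcomes_log = []  # stands in for wherever decision results are captured


def next_best_action(customer: dict) -> str:
    """Crude stand-in for an analytic model scoring the customer record."""
    if customer.get("open_complaints", 0) > 0:
        return "resolve complaint before making any offer"
    if customer.get("churn_score", 0.0) > 0.7:
        return "offer retention discount"
    return "offer standard upgrade"


def record_outcome(customer_id: str, action: str, accepted: bool) -> None:
    """Feed the result back so the recommendation logic can be tuned."""
    outcomes_log.append({"customer": customer_id, "action": action, "accepted": accepted})


customer = {"id": "C042", "churn_score": 0.82, "open_complaints": 0}
action = next_best_action(customer)            # 'offer retention discount'
record_outcome(customer["id"], action, accepted=True)
```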

Also, there are non-marketing requirements for a single view of the customer – in the sales-to-delivery process, and pretty much everywhere in a bank, especially in investment banks between the front, middle and back offices. Within these processes are very strong arguments for the development and maintenance of a single view of the customer in most organisations.


The BBC are showing a documentary (Horizon) this evening (12 Feb 08)  about making better decisions. It’s written up on their website.

Basically it's a presentation of Geek Logic by Garth Sundem, a book about equations/models that can be used to make better decisions in daily life, such as whether you should go to the gym or whether you should apologise for something. Mathematical modelling for those everyday occasions. It's a nice point, if a bit tongue in cheek.

The question I have is: how effective are these models? The more I look at the examples on the BBC site and play with the numbers, the more I see the holes in the equations – particularly how sensitive they are to one variable over another. And the very simple decision guide (if B > 1 then buy it) isn't very useful at all.
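To show what I mean about sensitivity – using a made-up stand-in for one of the equations, since I'm not reproducing Sundem's actual formulas here – a one-point change in a single subjective rating can flip the B > 1 recommendation on its own:

```python
# Hypothetical stand-in for a 'should I buy it?' equation (NOT the book's formula),
# just to show how sensitive a simple score is to one subjective 1-10 rating.
def buy_score(desire: float, usefulness: float, price: float, monthly_budget: float) -> float:
    """Made-up formula: B > 1 suggests 'buy it'."""
    return (desire * usefulness) / (10 * price / monthly_budget)


# Hold everything fixed except 'desire' and watch the recommendation flip.
for desire in (4, 5, 6, 7):
    b = buy_score(desire=desire, usefulness=1, price=150, monthly_budget=300)
    print(desire, round(b, 2), "buy" if b > 1 else "don't buy")
# 4 -> 0.8 don't buy, 5 -> 1.0 don't buy, 6 -> 1.2 buy, 7 -> 1.4 buy:
# one point on a gut-feel rating changes the answer.
```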

The idea is nice but the implementation here is trivial.

EDIT

I've been thinking about this a bit more, and the trouble I'm having isn't with the equations – which pass basic mathematical modelling tests, even if I'm still not confident enough in them – but with the data that passes as parameters for them. For a bit of fun, it's great. To use a similar process to actually automate decision making requires much better control of the parameter values and outcome bands. The data need to be trusted on the way in in order to make sense on the way out. Subjectiveness in data entry is a killer for automating decisions – someone is making a decision on the value of most parameters, and this needs to be objective!

I’m looking forward to watching the TV show this evening (or maybe on iPlayer) to see if it is a ‘just a bit of fun’ sell or they are using it as a real technique.



