Monday, February 19, 2018

IT is undergoing seismic change as next-generation technologies drive digital business transformation

How will you build the business case for digital transformation, or simply for your next cloud transition? We believe the answer is to:

1.     Assess cost by fully burdened workload, and
2.     Address the full lifecycle cost, including current and forecasted production costs as well as the often missed or underestimated migration costs.

We call this blind spot "The Gap".
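To make “The Gap” concrete, here is a minimal sketch in Python. Every figure (annual run rates, the migration bill, the five-year horizon) is a hypothetical assumption, chosen only to show how an omitted migration cost can flip a lifecycle comparison.

```python
# Minimal sketch of "The Gap": a lifecycle cost comparison that includes
# one-time migration costs, not just steady-state production costs.
# All figures are illustrative assumptions, not benchmarks.

def lifecycle_cost(annual_production: float, migration: float, years: int) -> float:
    """Full lifecycle cost: one-time migration plus production over the horizon."""
    return migration + annual_production * years

YEARS = 5
on_prem = lifecycle_cost(annual_production=1_000_000, migration=0, years=YEARS)
# Cloud looks cheaper per year, but carries a large one-time migration bill.
cloud = lifecycle_cost(annual_production=850_000, migration=900_000, years=YEARS)

print(f"On-prem, {YEARS}-yr: ${on_prem:,.0f}")  # $5,000,000
print(f"Cloud,   {YEARS}-yr: ${cloud:,.0f}")    # $5,150,000
# Ignoring migration, cloud "saves" $150K/yr; with it, the 5-year picture flips.
```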

To address this need, The TCO Alliance, a partnership of the International Institute of IT Economics and IT Business Decisions, is pleased to announce Workload TCO Analyst.

Key Issue: What is the true cost of IT transitions?
Solution: Perform a rigorous financial analysis that:
·       Covers major on-premises and managed services workloads
·       Develops workload baseline costs, future scenarios, and decision options
·       Uses detailed “puts & takes” to show financial impact, performance tracking, and “what if?” analysis

HOW WORKLOAD TCO WORKS
Very recently, in an enterprise not so very far away, a decision was made: how best to modernize an on-premises legacy HR application.

Situation: The 10-year-old HR application was highly customized to meet the changing business needs of the enterprise. It was integrated with several other applications and data sources. However, it was showing signs of age: it suffered increasing maintenance windows and unplanned downtime, it was pushing the capacity of its infrastructure, and it could not easily meet the demands of new social media and other workplace interfaces. This application was one of many workloads requiring financial analysis to assess the cost of moving to the cloud.

Task: Senior management had mandated a cloud-first strategy, requiring all new applications to be cloud-based (private, public, or hybrid) and any new investments to prefer cloud solutions. The driving forces behind this strategy were that cloud platforms may cost less, require less capital investment, enable a service-oriented strategy, and support a more agile, responsive IT function.

As this was a >$1M project, senior management oversight and due diligence dictated that all options be put on the table. The most viable options were to: 
A.    Do nothing
B.    Upgrade the on-prem application (vendor support for the current version would cease in 18 months)
C.    Modify the existing workload to make it “cloud-ready” and re-platform it, which implied refactoring, microservice APIs, and containerization
D.    Abandon the legacy application and move to a commercial SaaS offering

Approach: There were many decision factors, but a primary driver was economic. Which approach would cost less over time? This enterprise employed a process called Workload TCO Analysis (WTA) to compare the above scenarios. WTA is based on the concept of a Fully Burdened Workload (FBW), where TCO can be assessed for any discrete workload.
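As a rough illustration of the FBW concept (not the WTA methodology itself), the sketch below totals a workload's direct costs plus an allocated share of shared cost pools; the cost categories, pool sizes, and allocation fractions are all invented for illustration.

```python
# Illustrative Fully Burdened Workload (FBW) sketch: a workload's TCO is its
# direct costs plus an allocated share of shared infrastructure and labor.
# Categories and allocation fractions below are hypothetical.

DIRECT = {             # costs attributable only to this workload ($/yr)
    "licenses": 120_000,
    "dedicated_servers": 80_000,
}
SHARED = {             # enterprise-wide pools ($/yr) and this workload's share
    "data_center_facilities": (2_000_000, 0.03),
    "storage_and_network":    (1_500_000, 0.05),
    "ops_and_helpdesk_labor": (3_000_000, 0.04),
}

direct_total = sum(DIRECT.values())
allocated_total = sum(pool * share for pool, share in SHARED.values())
fbw_tco = direct_total + allocated_total

print(f"Direct:    ${direct_total:,.0f}")     # $200,000
print(f"Allocated: ${allocated_total:,.0f}")  # $255,000
print(f"FBW TCO:   ${fbw_tco:,.0f}")          # $455,000
```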

The 8-week project included three steps (a simplified sketch follows the list):

1.    Analyze the current workload TCO by building a baseline resource / asset model
2.    Forecast the baseline forward with a fact-based forecasting tool
3.    Develop a TCO scenario for each option and perform an in-depth financial analysis of each path forward
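Here is a simplified sketch of steps 2 and 3, assuming invented transition costs, run rates, and growth factors (not the engagement's actual figures): forecast each option's cumulative cost over the horizon and rank the results.

```python
# Sketch of forecasting a baseline forward and comparing option scenarios.
# Each option = one-time transition cost + an annual run rate with a growth
# (or decay) factor. All numbers are hypothetical.

def cumulative_cost(one_time: float, year1_run: float, growth: float, years: int) -> float:
    run = sum(year1_run * (1 + growth) ** y for y in range(years))
    return one_time + run

YEARS = 5
options = {
    "A: Do nothing":      cumulative_cost(0,       900_000, 0.08, YEARS),  # aging costs grow
    "B: Upgrade on-prem": cumulative_cost(400_000, 850_000, 0.03, YEARS),
    "C: Re-platform":     cumulative_cost(750_000, 700_000, 0.02, YEARS),
    "D: Move to SaaS":    cumulative_cost(600_000, 650_000, 0.04, YEARS),
}
for name, cost in sorted(options.items(), key=lambda kv: kv[1]):
    print(f"{name}: ${cost:,.0f} over {YEARS} years")
```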

Result: The engagement developed a defensible financial analysis that could be reviewed and understood by senior management. The decision was made to take option D: sunset the existing application and migrate to the cloud-based version from the same vendor within the next 12 months. The organization had originally compared only the SaaS costs to the on-prem costs; the WTA identified additional key costs such as multiple API integrations, business process and governance changes, and SaaS staff support costs.

The benefits of enabling new functionality and complying with the cloud-first strategy were the business justification drivers. Management had a clear expectation of the migration and long-term operations costs of the application.

The project produced the additional benefit of a baseline IT cost model that will be used to:
·       Build a strong financial foundation underneath the IT roadmap
·       Build a zero-based annual budget with “what if?” capabilities
·       Make other point decisions as new workloads are assessed
·       Build a fully loaded IT service cost consumption model

About The TCO Alliance
The TCO Alliance is a partnership of independent IT financial analysts and modeling experts formed to solve IT’s most critical cost assessment challenges.

The principals are:
Bill Kirwin - Inventor of TCO at Gartner. Founder, International Institute of IT Economics (IIIE)

Bob Multhaup - Former CIO, Inventor of ITin3D© cost modeling tool 

Peter Brooks - Business Value Consultant. Partner, IIIE

And a team of independent, IIIE-certified IT financial assessment professionals

What We Do
Show fact-based financial options for making major planning decisions
Deliver in-depth expert consulting and a working cost model of your optimum path forward

Our Value
Quantification of the optimum financial path forward, with in-depth, proven tools to avoid the pitfalls of digital transformation
Rapid, in-depth, affordable decision support to avoid major miscalculations in moving from one ‘S’ curve to the next
A solution that is scalable and flexible, for point decisions or major planning initiatives
More effective than biased, vendor-driven analysis and expensive management consulting

Our Unique Deliverables
We use ITin3D©, a powerful financial modeling and decision support tool to:
·       Defensibly communicate transition costs to senior management
·       Model scenario options, including financial impacts, to evaluate the ‘best’ path forward
·       Plan flexibly, track actuals, and easily rework the plan to re-forecast financial impacts as conditions change

TCO Alliance models work for any IT change scenario: BYOD, Data Center Migrations, Sourcing, Digital Business Transformation. If you can think it, we can model it.

Contact:

The TCO Alliance

info@iiievalue.com


Thursday, February 1, 2018

The Road to Workload TCO: The New Unit of Measure for IT Cost



By Bill Kirwin

When I first created TCO, it was a way of understanding the full life-cycle cost of assets. The basic unit of measure was the thing: a PC, a server, or any of the other stuff commonly found from the desktop to the data center. My advice to Gartner clients was “when you open the shipping boxes, out come the computers, and an assortment of people from the help desk, system administration, security, procurement, asset management, IT management, and end users.” The TCO was based on the physical asset, and all the related labor and service costs were bundled into that asset. This was the logical way to look at IT in the 1980s and 1990s. TCO was designed to identify and manage the cost of the IT asset. This was also helpful in assessing the cost differences between vendors of those assets, famously between Wintel and Apple. If you wanted to know your end-user help desk costs, you could multiply the help desk unit cost per PC by the number of PCs, and it was a reliable metric. As a management tool, it conformed to the siloed nature of IT disciplines.
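For instance, the asset-era arithmetic was as simple as unit cost times asset count; the figures below are made up purely to illustrate the shape of the calculation.

```python
# Toy example of asset-era TCO math: per-unit cost x number of assets.
# The $340/PC help desk figure and fleet size are invented for illustration.
helpdesk_cost_per_pc = 340   # $/PC/year, hypothetical
pc_count = 2_500
print(f"Annual end-user help desk cost: ${helpdesk_cost_per_pc * pc_count:,}")
# -> Annual end-user help desk cost: $850,000
```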

In an early wave of democratization, TCO was also a way for IT to demonstrate the value added beyond the base cost of the PC, to counter the end user argument that the “same” PC was cheaper at Fry’s or Crazy Eddie’s.  

Then in the late 1990s, IT organizations followed the lead of the vendors and started restructuring their deliverables as services, collapsing the silos (data center, desktop, networking, applications) into ITIL services, then packaged offerings like Infrastructure as a Service, ERP as a Service, Business Process as a Service, Mobility as a Service, etc. Service Catalogs were then developed as the marketing face and cost recovery structure for IT. While Gartner predicted ITaaS was the way to go, it has taken IT a long time to evolve as a service provider. In the meantime, the vendor side of the market has completely reinvented itself around the service model. Today, very few IT providers have pure-play asset offerings.

Meanwhile, the technology has become more abstract. Virtualization caught fire and redefined assets like servers and desktops. This has evolved into the software-defined data center (SDDC) as networks and storage are virtualized.

The abstraction of physical assets was a key driver of new ways of thinking about IT cost. Now, instead of a one-for-one replacement of a device with a cheaper, better, faster, or easier-to-use device, other options emerged. These options drive multiple scenarios for an asset: it can be replaced with another asset; it can be virtualized; it can be offered as a service; it can exist in the cloud. Financially, the asset can be capitalized or take one of several operating expense options. These were critical forks in the road to determining the best TCO decisions.
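As a rough sketch of that financial fork, compare capitalizing an asset (straight-line depreciation over its useful life) with an operating-expense subscription for the same capability; the prices and useful life below are hypothetical.

```python
# Hypothetical comparison of financing treatments for the "same" capability:
# capitalize a $300K asset (straight-line over 5 years) vs. pay $75K/yr opex.
asset_cost, useful_life = 300_000, 5
annual_depreciation = asset_cost / useful_life   # $60,000/yr hits the P&L
annual_subscription = 75_000                     # pure operating expense

for year in range(1, useful_life + 1):
    print(f"Year {year}: capex route ${annual_depreciation:,.0f} "
          f"vs. opex route ${annual_subscription:,.0f}")
# Same workload either way; the TCO and cash-flow profiles differ.
```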

Early in this phase of IT evolution, we renamed TCO to TCS (Total Cost of Services). The TCS tool repackaged the resources into a cost-per-service mold. This has been very helpful to enterprises that need a cost underpinning for their service portfolios. It has also been useful as a decision support tool to assess the cost differences between commercial offerings. The question was no longer which server has the lowest TCO, but which IaaS offering has the lowest TCO.

At the same time as vendors and buyers moved to a service-based model, another massive wave of IT democratization landed. IT no longer had a captive market. Business users were shopping the market for IT capabilities. IT has been forced to compare itself to market pricing, again to demonstrate its competitive value and establish a base for chargeback or showback. Without a cost basis, IT is caught in the conundrum of being perceived as both “free” and “overpriced,” depending on the situation.

Now cloud computing has emerged, and virtually every IT function is available as a service, priced by consumption. Common iterations of cloud include IaaS, PaaS, and SaaS, delivered on public, private, or hybrid platforms.

With the advent of virtualization and cloud computing, we started talking in terms of workload. The concept of a workload is simple: it’s a complete unit of functionality. Defining a specific workload is trickier.

In the virtualization era, workloads were asset-defined, and while there are nuances, like the fact that virtual machines need to run on physical hosts, traditional TCO analysis could still be applied. A virtual server is a server; a virtual desktop is pretty much a PC.

In the cloud era, services are unitized into workloads that are created in virtual domains. The workload can be crafted to the type of cloud offering that is the best fit. Let’s use a traditional HR application, running on premises on physical or virtual servers. The entire application can be viewed as a workload, or just the infrastructure can be isolated as a workload. Same workload, vastly different cost analysis. This workload could also be compared to a SaaS offering with similar functionality. Again, same workload, different cost analysis.

Please note that the principles of TCO remain the same. TCO is a holistic view of costs, across enterprise boundaries, over time. It consists of a collection of cost items, the resources (a chart of accounts or bill of materials), and how much of those resources the workload consumes.
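In other words, workload TCO can be sketched as a sum over cost items of each resource's cost times the fraction the workload consumes, and scoping the workload differently changes which items count. The chart of accounts below is invented for illustration.

```python
# Minimal sketch: TCO = sum over cost items of resource cost x consumption.
# Scoping the "workload" differently (whole app vs. infrastructure only)
# changes which cost items are included. All entries are hypothetical.
COST_ITEMS = {  # item: (annual cost $, fraction consumed, category)
    "servers":       (800_000, 0.05, "infrastructure"),
    "storage":       (400_000, 0.10, "infrastructure"),
    "network":       (300_000, 0.04, "infrastructure"),
    "app_licenses":  (150_000, 1.00, "application"),
    "app_dev_labor": (600_000, 0.20, "application"),
}

def workload_tco(scope=None):
    """Sum cost x consumption, optionally restricted to one category."""
    return sum(cost * used for cost, used, cat in COST_ITEMS.values()
               if scope is None or cat == scope)

print(f"Whole application:   ${workload_tco():,.0f}")                # $362,000
print(f"Infrastructure only: ${workload_tco('infrastructure'):,.0f}") # $92,000
```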

However, workload TCO adds complexity to the analysis. Multiple scenarios, using the same resources but with different allocations, need to be projected. The tools that worked for asset-based TCO, and even for TCS, need to be more sophisticated, flexible, and iterative.

This is why The TCO Alliance created Workload TCO Analyst. WTA is a powerful tool and methodology that provides in-depth, fact-based cost baselines and projected future scenarios. Please contact us at info@iiievalue.com, or simply call me at 203.215.7717, with any questions about how this may apply to your project.
