December 20, 2013
    With the massive uptake of virtualization and, more recently, of various types of cloud computing, the pace of change in today’s IT environments has never been faster. Under the covers, commoditization of the underlying hardware and even software platforms promises to drive computing costs ever lower. Offsetting this dynamic is an increased level of complexity, which creates new service risks and increases the likelihood of overspending.

    Traditionally, IT costing efforts, when done at all, have been performed at a higher or macro level. For example: total capital costs for data center construction along with the associated annual operating costs for things like power, floor space, cooling, and so on; or budgeting for server or storage resources on a yearly basis from forecasted business growth scenarios. In today’s distributed systems world, any type of cost allocation has been, in most cases, coarse at best. Sometimes IT costs are shared equally by all organizations using the total infrastructure, but this approach at best creates political tension and at worst drives organizational behavior toward acquiring resources outside the influence and control of IT policies and procedures.
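
    To make the allocation problem concrete, the minimal sketch below contrasts an equal-share split of a shared infrastructure bill with a usage-weighted split. The business units, usage figures, and dollar amounts are all hypothetical.

```python
# Hypothetical illustration: allocating a shared infrastructure bill two ways.
# The business units, usage figures, and dollar amounts are invented for the example.

total_monthly_cost = 90_000  # total shared infrastructure spend for the month (USD)

# Measured resource consumption per business unit (e.g., CPU-hours).
usage = {"retail": 12_000, "logistics": 3_000, "analytics": 15_000}

# Approach 1: equal share -- every unit pays the same regardless of consumption.
equal_share = {bu: total_monthly_cost / len(usage) for bu in usage}

# Approach 2: usage-weighted -- each unit pays in proportion to what it consumed.
total_usage = sum(usage.values())
usage_share = {bu: total_monthly_cost * u / total_usage for bu, u in usage.items()}

for bu in usage:
    print(f"{bu:<10} equal: ${equal_share[bu]:>9,.2f}   usage-weighted: ${usage_share[bu]:>9,.2f}")
```

    The gap between the two columns (here, the lightest consumer pays more than three times its usage-weighted share under equal sharing) is exactly the kind of discrepancy that fuels the political tension described above.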

    Most finance organizations have (somewhere!) some type of asset database covering all data center resources: when each was purchased, its price, some type of amortization schedule, and some level of annual operating expense associated with the asset. Typically this information is owned and controlled by the financial side of the organization. In addition, there is usually some source of information relating these assets to the business units, services, and/or applications they support.
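
    A minimal sketch of the kind of record such a database might hold, and of turning it into an annual figure via straight-line amortization, is shown below. The field names, schedule, and numbers are assumptions for illustration, not a description of any particular finance system.

```python
# Hypothetical sketch of a finance-owned asset record and its annualized cost.
# Field names and figures are illustrative assumptions.
from dataclasses import dataclass
from datetime import date

@dataclass
class Asset:
    asset_id: str
    description: str
    purchase_date: date
    purchase_price: float      # capital cost at acquisition
    amortization_years: int    # straight-line amortization schedule
    annual_opex: float         # maintenance, support contracts, etc.
    business_unit: str         # organization / service the asset supports

    def annual_cost(self) -> float:
        """Amortized capital cost per year plus annual operating expense."""
        return self.purchase_price / self.amortization_years + self.annual_opex

server = Asset("srv-0042", "2U database server", date(2012, 3, 1),
               purchase_price=18_000, amortization_years=3,
               annual_opex=2_400, business_unit="analytics")

print(f"{server.asset_id}: ${server.annual_cost():,.2f} per year, charged to {server.business_unit}")
```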

    Most IT operations organizations have multiple tools that monitor and measure the availability and performance of all IT technology resources. Furthermore, they have one or more sets of tools and approaches for measuring their ability to successfully deliver service to their various lines of business and customers.

    Most data center management teams have a fairly complete understanding of their data center floor: power capacity, equipment footprint layout, total cooling capacity, and costing information such as cost per square foot.
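
    As a purely illustrative piece of arithmetic, the sketch below shows how facility-level figures of this kind can be translated into a per-rack annual cost; every number is hypothetical.

```python
# Illustrative only: turning facility-level figures into a per-rack annual cost.
# Every number here is hypothetical.

annual_facility_cost = 2_400_000   # construction amortization + power + cooling + staff (USD/yr)
raised_floor_sqft = 10_000         # usable equipment floor space

cost_per_sqft = annual_facility_cost / raised_floor_sqft   # $240 per square foot per year

rack_footprint_sqft = 25           # rack plus service clearance
cost_per_rack = cost_per_sqft * rack_footprint_sqft        # $6,000 per rack per year

print(f"${cost_per_sqft:,.0f}/sq ft/yr  ->  ${cost_per_rack:,.0f}/rack/yr")
```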

    To date, these three disciplines have traditionally coordinated with one another through nothing more than anecdotal, ad hoc, or manual communication. Yet there is a huge opportunity to add value through close collaboration, the goals of which should include:

    1. Finance wants to measure the value of IT but has no way of putting a currency value on the business work that IT resources are actually accomplishing.

    2. Data center management wants to optimize the cost of the data center but has no good way of understanding how much work the data center is supporting, or could support, over time.

    3. IT operations wants to cost-effectively ensure the delivery of acceptable service within ever-declining budget constraints.

    Each of these three domains has a very large, multibillion-dollar solution ecosystem built around optimizing use cases within that domain individually. For example, there are hundreds of server, storage, and network management and monitoring solutions for the performance and availability management of IT resources; there are many dozens of DCIM solutions for managing the physical data center; and there is a plethora of solutions for financial and asset management. All of these solutions were designed around use cases that lie solely within their own domain and are therefore capable of accepting only metrics and data sources from within that domain.

    Until very recently, software solutions did not exist that would allow or facilitate a more seamless and productive collaboration across these organizations in support of these goals. Recently, however, new technologies in data access and analytics have made it practical to attack this challenge of intelligent, proactive collaboration across these disciplines and toolsets. There are two main philosophical approaches, each with its own benefits and downsides: data warehousing (big data) approaches and federated data access approaches.
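
    The sketch below contrasts the two styles using in-memory dictionaries as stand-ins for the finance, monitoring, and facilities systems. All of the names, schemas, and numbers are hypothetical; it illustrates the data flow, not any particular product’s API.

```python
# Conceptual contrast of the two integration styles, using in-memory dicts as
# stand-ins for the finance, monitoring, and facilities systems. All names and
# numbers are hypothetical.

# Three "source systems", each owned by a different team.
finance    = {"srv-0042": {"service": "billing", "annual_cost": 8_400}}
monitoring = {"srv-0042": {"service": "billing", "transactions_per_year": 12_000_000}}
facilities = {"srv-0042": {"annual_power_and_space_cost": 1_100}}

# Approach 1: warehouse / big data style -- periodically copy everything into one
# combined store, then answer questions from that copy. Consistent, but only as
# fresh as the last load.
warehouse = {}
for asset_id in finance:
    warehouse[asset_id] = {**finance[asset_id],
                           **monitoring[asset_id],
                           **facilities[asset_id]}

# Approach 2: federated access -- leave the data where it lives and join across
# the live sources at query time. Always current, but dependent on every source
# being reachable when the question is asked.
def cost_per_million_transactions(asset_id):
    cost = finance[asset_id]["annual_cost"] + facilities[asset_id]["annual_power_and_space_cost"]
    work = monitoring[asset_id]["transactions_per_year"]
    return cost / (work / 1_000_000)

print(warehouse["srv-0042"])
print(f"${cost_per_million_transactions('srv-0042'):,.2f} per million transactions")
```

    The trade-off is the familiar one: the warehouse copy is consistent and fast to query but only as fresh as its last load, while the federated query is always current but only as available and performant as the slowest of its sources.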

    Category: data-center