

January 16, 2017

The Arc-E-Tect's Predictions for 2017 - APIs and Web Services [2/10]

The Arc-E-Tect's Prediction on APIs


It's 2017, meaning 2016 is a done deal, and most of my predictions for 2016 (I made them about a year ago, but never got around to documenting them in any form) have proven to be absolutely bogus. Which leads me to come up with my predictions for 2017 and, this time, properly document them. So read on, and hopefully you'll enjoy them. Unfortunately we'll have to wait about a year to find out to what extent I got it all right, but for now, ... APIs!

Why APIs? Well, APIs are all the rage, and everybody and their mother is working on platforms. And as we all know: without APIs there is no platform.

APIs in, web services out

Okay, in 2017 we'll feel ashamed when we talk about web services and SOA. Instead we'll talk about APIs. This is closely related to my first prediction, on Microservices, which you can read here.

To many people, APIs are basically another word for web services, so there's not much difference here. Just as with Microservices, we'll more and more see APIs mentioned where in the past we talked about web services. That will last until everybody refers to platforms instead, something that is really picking up traction. Still, I'm not talking about platforms here, but about APIs.

The reason for this is their strong relationship with Microservices. Delivering a platform is a strategic decision that defines the direction in which an organisation is thinking about products. APIs, which expose the functionality of a platform, are products. You can read all about this in my series on API Management on Azure, which you can find here. Unlike web services, which are pieces of functionality and/or data exposed via a well-defined interface, APIs are always targeted at an external consumer of the service. In other words, an API never knows who's calling, nor does it make any assumptions about who's calling. Web services, on the other hand, might very well be limited to a known set of consumers and make assumptions about those consumers.
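To make that distinction concrete, here's a minimal sketch in Python (using Flask; the endpoint and field names are hypothetical) of an API that assumes nothing about its callers:

from flask import Flask, abort, jsonify, request

app = Flask(__name__)

@app.route("/v1/orders/<order_id>", methods=["GET"])
def get_order(order_id):
    # An API never knows who's calling: authenticate and validate
    # every request instead of trusting a known consumer.
    if request.headers.get("Authorization") is None:
        abort(401)
    if not order_id.isdigit():
        abort(400)
    return jsonify({"id": order_id, "status": "shipped"})  # stubbed response

# A classic web service, by contrast, might skip these checks entirely
# because it "knows" its only consumer is the billing system next door.

if __name__ == "__main__":
    app.run()

Note the /v1/ in the path: an API is a product, and versioning its interface is part of treating it like one.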

The promise of web services, predominantly a decoupling between functionalities in an IT landscape, is limited to those cases where the web service, or rather its interface, is treated as an API. APIs, almost by definition, are bound to deliver on the promise of decoupling. When developed properly, taken care of, and fronting independent pieces of software, APIs are the closest thing to silver bullets in software we've come across so far. And since we love silver bullets, APIs are in; web services turned out to be just plain old regular bullets, so they're out.


Thanks once again for reading my blog. Please don't be reluctant to Tweet about it, put a link on Facebook or recommend this blog to your network on LinkedIn. Heck, send the link to my blog to all your WhatsApp friends and everybody in your contact list. But if you really want to show your appreciation, drop a comment with your opinion on the topic, your experiences or anything else that is relevant.

Arc-E-Tect

January 5, 2017

The Arc-E-Tect's Predictions for 2017 - Microservices and SOA [1/10]

The Arc-E-Tect's Prediction on Microservices

It's 2017, meaning 2016 is a done deal, and most of my predictions for 2016 (I made them about a year ago, but never got around to documenting them in any form) have proven to be absolutely bogus. Which leads me to come up with my predictions for 2017 and, this time, properly document them. So read on, and hopefully you'll enjoy them. Unfortunately we'll have to wait about a year to find out to what extent I got it all right, but for now, ... Microservices!

Why Microservices? Because they're not just hype anymore, but are moving into the realm where people apply them because they're useful and not just cool. So my prediction is:

Microservices in, SOA out

That's right. In 2017 people will start looking at Microservices as something that is useful and way better to have in your architecture than services. So a Microservices Architecture will replace Service Oriented Architectures in 2017.

What does this mean?

Well, not that much at first, until one realizes that an important reason for SOA, the DRY principle, is no longer valid. DRY, Don't Repeat Yourself, is an important aspect of procedural programming that got more attention in object-oriented programming and became one of the even more important aspects of SOA.
In procedural programming, developers introduce procedures to have a single implementation of an algorithm. By calling that procedure over and over again instead of programming it over and over again, the developer reduces the chance of errors in the code.
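In code, it is as simple as it sounds; a trivial sketch in Python (the VAT rule is just a made-up example):

def vat(amount, rate=0.21):
    # One implementation of the algorithm...
    return round(amount * rate, 2)

# ...called over and over instead of programmed over and over.
invoice_total = 100.00 + vat(100.00)
order_total = 250.00 + vat(250.00)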
Then came object-oriented programming, and procedures, now called methods, were clustered together in classes to ensure that concepts and their behaviour only had to be programmed once and used often. It's a flaky definition, but it covers my intention. So re-use was immensely important for object-oriented programming, and this is where DRY became an important architecture principle, even a pattern.
After some time, developers started moving towards services, web services that is. In OO (object orientation), it's good practice to separate the interface to a class' methods from their implementation. Depending on the programming language you use, you'll use a variety of techniques to accomplish this. In an SOA, and yes, I'm a bit short on the elaboration here, we separate the implementation of a service from the interface used to access it. The whole idea is that you implement certain functionality only once, the service, and use it from around the world. By sticking with the interface you can change the implementation without affecting the callers of the service.
So in an SOA, a Service Oriented Architecture, re-use is huge and DRY is a pattern.
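For illustration, a minimal Python sketch of that separation (all names hypothetical): callers program against the interface, so the implementation behind it can change freely.

from abc import ABC, abstractmethod

class PaymentService(ABC):
    """The published interface: this is all a caller gets to see."""
    @abstractmethod
    def pay(self, amount: float) -> str: ...

class CardPayments(PaymentService):
    """One implementation; it can be replaced without touching callers."""
    def pay(self, amount: float) -> str:
        return f"paid {amount:.2f} by card"

def checkout(service: PaymentService, amount: float) -> str:
    # Depends only on the interface, not on any implementation.
    return service.pay(amount)

print(checkout(CardPayments(), 42.0))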

Here come the Microservices. Microservices are not tiny services; instead they are completely self-contained implementations of business concepts. They're independent of each other. They can use each other, they can rely on one another, but they don't depend on each other. That's a key principle of Microservices. It also means that if two of them use the same algorithm or other functionality, they both implement it. The code is not reused! Because if one were to reuse parts of the other, it would depend on it. So, in a Microservices architecture, DRY is an anti-pattern.
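A contrived sketch of what that means in practice (hypothetical services and a made-up validation rule):

# Two independent Microservices, each owning its own copy of the same
# validation rule, on purpose.

# orders service
def orders_valid_sku(sku: str) -> bool:
    return len(sku) == 8 and sku.isalnum()

# inventory service: deliberately repeats the rule rather than importing
# it from the orders service, so neither depends on the other and each
# can evolve its rule independently.
def inventory_valid_sku(sku: str) -> bool:
    return len(sku) == 8 and sku.isalnum()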

In 2016 everybody started doing Microservices, but actually they implemented an SOA and called it a Microservices architecture, because SOA was, well, old school. SOA has become tainted, for one because most enterprises confuse an SOA with an architecture built around an ESB, and they're strengthened in the belief that this is correct by the ESB vendors. Which is of course just marketing horse dung, as you can read here. In addition, SOAs have hardly ever delivered what vendors and architects promised: agility, lower costs of change, and independence of clients (consumers) and servers (producers). Why? Because ESBs, and by association SOAs, make architectures more complex and harder to change, although they hide that complexity inside the products, making them look less complex. But ask yourself: did you ever try to migrate from one ESB (version) to another?
So we dropped the SOA and brought in the Microservice to attract new developers.

Fortunately, Microservices are hard and costly to develop and maintain. From day one. Therefore all adopters of this new and shiny architecture had to reconsider their choice at a very early stage. They realised that understanding is more important than knowing, and that none of the vendors had a ready out-of-the-box Microservices solution to sell. As a result, the level of knowledge and understanding in 2017 is already such that Microservices will start to replace SOAs.

In 2017, companies will drop service oriented architectures in favour of Microservices architectures instead of implementing SOAs with Microservice look-alikes.

Thanks once again for reading my blog. Please don't be reluctant to Tweet about it, put a link on Facebook or recommend this blog to your network on LinkedIn. Heck, send the link to my blog to all your WhatsApp friends and everybody in your contact list.
But if you really want to show your appreciation, drop a comment with your opinion on the topic, your experiences or anything else that is relevant.

Iwan

November 21, 2014

The not so disruptive nature of the cloud - Centralized vs. Democratized

So, as the title of this post suggests, I want to discuss the disruptive nature of the Cloud, or rather the cloud not being so disruptive at all. This is a series of 5 posts in total; you're reading the fourth post.

Read about the cloud and what it means, and you're bound to read that the introduction of the Cloud, the real Cloud, the one that meets all the criteria for being a Cloud, has been disruptive.

I like the NIST definition of Cloud; it is quite comprehensive and less vendor-biased than Gartner's definition.

The Cloud has been disruptive when it comes to the IT industry, especially the hosting market. But it has also been a force in how we handle IT within the enterprise. There are a few important aspects of IT in the enterprise that we consider to have changed due to the Cloud:


  • Moving from in-house IT management services to off-site IT management services. 
  • Moving from CAPEX (Capital Expenses) based IT investments to OPEX (Operating Expenses). 
  • Moving from on-premise (business) applications to off-premise (hosted) applications. 
  • Moving from a centralized IT to a democratized IT 

I'm sure you can think of other movements in your IT environment as well, but these are typically considered to be not only happening, but also disruptive within the enterprise.

In fact, these are not really changes happening due to the Cloud; the Cloud merely gave these movements a boost and fast-tracked the changes in IT.

Last time, which was a while ago, I wrote about the location of the data center, or rather about where the IT infrastructure was located. This time around I want to discuss how IT resources find their way to the user, the customer.

Ever since the beginning of (business) usage of IT within the enterprise, there has been a movement back and forth between centralized governance and decentralized, or even democratized, governance. But first let me explain what I mean by 'democratized'. It is actually quite simple. Democratized means that everybody and their mother can obtain or have access to IT resources, in this case computers, storage, network access and software applications.
In the era of mainframes, IT was centralized; with the advent of PCs it got democratized; with client/server architectures it moved back to centralized; and with the advent of thin clients, it became even more centralized. But the key here is that we were in a democratized IT situation a long time ago, when PCs were introduced and software was installed locally on the PC by either a support engineer or the user herself. As long as you could get hold of the installation disks (and of course a valid license) you could install whatever you wanted. This was very beneficial for the agility of the user but very bad for cost control on IT support expenditure. Because with all that software installed, viruses got installed as well, rendering the PCs unusable, and a support engineer had to come over and fix the problem.
Another important problem with a decentralized, and especially a democratized, IT environment is collaboration between users. Apart from the fact that there is in principle no common ground for data, or information exchange if you will, the diversity of applications in use, similar applications at that, means that exchanging information, actually collaborating, is cumbersome to say the least. This resulted in a move towards centralization, where PCs are not much more than computers that provide a user-friendly interface on top of a centralized application.

With the advent of the cloud, and specifically SaaS offerings, the model became intrinsically more centralized. But at the same time, because of the public nature of SaaS offerings, they became available to anybody with a means to pay for the service, and thus democratization entered the enterprise again. With IaaS, but even more so with PaaS, the democratization of IT resources extended beyond (business) applications to computing resources and storage. Both Amazon and Microsoft offer a ton of additional services on top of their own platforms, provided by both themselves and third parties.
Everybody with a credit card can get their own enterprise software running in the cloud, create their own development environments or deploy a new marketing website. Typically with better availability promises than their internal IT department can offer.

Is this a new way of working? Is democratization a new way for enterprises to handle IT? Hardly. So once again, there's nothing disruptive here that is related to the cloud.

So, why is the cloud really taking off? Why hasn't the hype died yet? Why is the cloud causing IT departments such headaches? In the next, fifth and last installment of this series, I will reveal the true disruptive nature of the cloud. Stay tuned.


July 1, 2013

Communicating while working - Message in a bottle... kind of

Hi fellow architects and other readers,

Over the coming period I will make an effort to post a weekly article on communication in the digital realm and its implications for architecture and the role of the architect. The articles will be diverse in terms of tone, topic and angle. That was my intention when I wrote the first article in this series, on January 17th, 2013. That has been weeks, well, months, ago and this is only the second installment of a multi-post item on communication. Shame on me.

Last time I discussed my experiences with online chatting on a personal level. It was a historic overview of me using various programs and media to digitally stay in touch with friends and family.

This time I'm taking it into the office and will discuss the digital means of communication in the workspace. I talked about VoIP last time because for the consumer it was revolutionary, but in this post I will not touch upon it at all, as VoIP in the office is merely a technology to have a phone system in place. Although digital, it is still the traditional way. (Leave your opinions in the comments, because there's a lot to say about VoIP in the office not being just a traditional means of communication like the regular phone.)

Ever since I was first connected to a computer network, there has been some form of email system: the digital equivalent of a regular letter. Back in the late eighties and early nineties these were proprietary systems, confined to the network you were on. Although there was an open system implemented on UNIX, which we used at university.

Email has been around forever and it is the primary means of communication between people in the digital realm, well, in most cases. Email is the predominant identification of a person on the Internet; it is reasonable to state that everybody with an Internet connection has at least one email address. Many, like myself, have multiple addresses. Email is faster than postal mail, so it is very convenient to most people. With the advent of broadband, emails have become richer in content as well. The Internet email protocols have been adopted by all email systems, and there are hardly any proprietary systems in use any more, except for very specific situations, most of which deal with specific security circumstances. Email is not secure by any means; it is arguably significantly less secure than regular mail, as an email can be opened and read by anybody without sender or receiver ever being aware of this. This is hard to accomplish with an "analog" letter, as you would notice the envelope having been opened when you receive the letter.

This security aspect of email has been a concern for many; especially companies that deal with confidential data are extremely aware of this.

Another important security issue is that it is very simple to send an email pretending to be somebody else. Since the email protocol works in clear text, anybody who can intercept an email can change it before sending it along its way. And since the premise of the internet is that it is a highly redundant network that can withstand a nuclear attack, anybody can sit in between any two parties that exchange an email. But more importantly, the internet is a mesh network of point-to-point connections. It's a graph where every connected computer is a node. The connections are known because every node has an address, its IP. And these IPs are structured in a hierarchy.
On top of that, pretty much all connections are wired connections, because these are reasonably reliable and cheap as well. This also means that continents are connected by, literally, just a few wires. Put your computer on one of these wires and you're in the middle of all intercontinental communications. Including email. Although very simplified, this is actually scarily accurate.
So it's easy to pretend you're somebody else when sending an email. Just as simple as writing another name at the bottom of a letter. And the solution to prevent this is analogous to the analog letter: signing the letter with a signature that is hard to fake. And here's another analogy with the analog world: how do you know which signature belongs to whom? In the physical world, this is handled by big books with names and signatures; when you get a letter that's signed, you open the book and compare signatures. And this is not a joke, this is how it's done. In the digital world, we do the same. We have digital books (registries) with names and the digital signatures that belong to those names, and this is how we validate the authenticity of a signature. It's that simple... and really complicated. Because bits are only ever 0 or 1 and therefore very easy to recreate. Do it in the right order and you can make a perfect copy of a signature. So we've got all kinds of mathematical schemes to ensure that it is as hard as possible to recreate the order in which the bits are written. And we distribute the signatures using a key infrastructure.
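For the curious, a bare-bones sketch of signing and verifying with the Python 'cryptography' package (key distribution, the hard part described above, is not shown):

from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

private_key = Ed25519PrivateKey.generate()  # kept secret by the sender
public_key = private_key.public_key()       # published in the "big book"

message = b"I really did send this email."
signature = private_key.sign(message)

try:
    public_key.verify(signature, message)   # raises if message or signature
    print("signature is authentic")         # was tampered with
except InvalidSignature:
    print("signature does not match")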
The tricky part of this is that you need to trust that the person sending you his signature is the person he claims to be based on that signature. Consequently, this is not a solution to be used on a large scale where nobody knows anybody.

The key with email is that its use and its validity are completely based on trust. But there is always plausible deniability as an option for the "sender" when he inadvertently sends an email he never wanted to send in the first place.

By the way, the previous part of this post covers only half of what non-repudiation is all about: being unable to deny ever having sent an email. You just can't get that without restricting yourself to a small group of people you want to exchange emails with.
The other half is denying you ever received and opened an email. This is like registered mail (delivery receipt), with or without a signed receipt. Typically only enterprise-grade products like Lotus Notes and Microsoft Exchange, to name the two biggest email systems for the corporate world, have the option to automatically notify the sender when the email was delivered to the recipient's inbox, and again when the email was opened. Systems like Hotmail and GMail don't support this. So it's of little use, to be honest, in today's email ecosystem.
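As an aside, requesting such a receipt is nothing more than adding a header to the message (RFC 8098); here's a sketch using Python's standard library. Note that the recipient's client is free to ignore the request, which is exactly the problem:

from email.message import EmailMessage

msg = EmailMessage()
msg["From"] = "sender@example.com"
msg["To"] = "recipient@example.com"
msg["Subject"] = "Please confirm"
# Ask the receiving client to report back when the mail is opened.
msg["Disposition-Notification-To"] = "sender@example.com"
msg.set_content("Let me know you read this.")
print(msg)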

But there's still a valid use of email in the enterprise. It's one of the most efficient ways to inform large groups of people about something that concerns them all. Because everybody is familiar with email, its adoption as a means to convey a message is massive. The analogy with mail helps, of course.
The ability to add attachments to an email is of course also of huge benefit. One can send a large document or documents, which could be anything ranging from a text file to pictures, schematics, videos or music tracks, as an attachment, where the email body is just an introduction to the real goodies in the attachment.

The problem with email is actually its wide adoption and its low threshold for use, resulting in spam: unsolicited marketing garbage that clutters corporate inboxes with irrelevant emails and keeps people from doing their jobs. And then there are all the jokes and department party pics that keep people away from their work as well. The little effort it takes to write an email, its asynchronous nature, and the improper use of any means to make an email seem more urgent or important have, over the last 5 years or so, caused the enterprise to transition away from email-based communications to something else.
Many enterprises are still searching for a good replacement. With a lack of alternatives to email that have all the good stuff and none of the bad stuff, email is still the prime means of collaboration in enterprises.

In the late nineties I was working at companies of all sizes whose email systems were limited to the extent of the enterprise, and then there was personal email. With systems like CompuServe and MSN (both were at that time a proprietary alternative to the web) one could send emails to users outside your own organization. This changed when the internet bubble started to grow around 1999. Hotmail was the big advocate of internet-based email, and with websites popping up like corn grains in a popcorn machine wanting to send you information, email grew rapidly, and enterprises started to understand the importance of email to communicate with potential customers.
Interestingly enough, email turned into the most prominent and important way to communicate with the rest of the world, alongside websites. But internally, neither the intranet nor the corporate email system took over this role from internal circulations, flyers handed out at the door, and the surprise brochures you stumbled upon in the morning when arriving at your desk. For some reason we still don't see email as a viable means to communicate internal matters, and we still rely on the hard copy of the same message.
I have noticed that I am more likely to read a piece of paper left at my desk the night before than an email containing the same information left in my inbox around the same time. The reason behind this I don't know. The piece of paper is more intrusive, no question about it. It's typically placed on my keyboard, so it prevents me from doing my work. The email in my inbox is easily ignored; it just sits there, unread. Maybe this is why I read the piece of paper: I have to pick it up and put it somewhere else before I can do my job. But still, I could move it aside without reading it. So I guess it's more a matter of habit; a piece of paper is to be read. This is what I was brought up with. Books, papers, magazines, flyers, brochures, pamphlets: they are all pieces of paper with words on them, picked up by me to be read. I have to read it, it's the natural course of things. Email is not like that. I am more likely to think an email is too long to be read than a double-sided printed memo about the same.

Based on this, I don't think that email will ever be as effective as paper. Not in the corporate world, not to inform people. Yes, you can use it very effectively to get your point across, but nothing more than your point. We, the working people, are not yet ready to use an all-digital format to inform each other. We're still too analog. And when we're ready, email will not be that format. Why? Because it's too much like mail without an 'e'. What will that other format be? Intranet, document management systems, social media for the enterprise, chat programs? Well, I'll venture into those areas in the next installments of my blog, and I will seriously make an effort not to wait this long again for my next post.

Until my next post...

Iwan

Find me on LinkedIn or Twitter

June 5, 2012

IT is irrelevant when it comes to business criticality!

I hope I got your attention, because that is the sole reason for the title of this post. Of course IT is not irrelevant; it is very relevant. But it is not as important as many architects would like you to believe. Well, IT architects, that is. I would like to believe that your Enterprise Architect and your Business Architect would actually subscribe to this notion.

So what was I thinking when I decided that IT has no relevance when it comes to business criticality? You may actually ask yourself what I was smoking or sniffing when I started this post. Well, the sheer fact that businesses were there long before the arrival of IT, and that most of our business processes today can still very well be executed without IT. Probably not as efficiently, after all that is why we have IT, but they can be done nevertheless.
Communication, on the other hand, is very crucial, and having a means of communication is critical for business continuity. The more efficiently we communicate, the more efficiently we can do our work. This is why the most effort in IT has always been in the area of getting the right information to the right place at the right time.
And this is where my post starts to make sense, well, it's the start. When we talk about information getting to the right place at the right time, we are talking about defining the business process; after all, this is what a business process is all about: moving information from one person to another, from one system to another. With every person, every system doing its magic with that information: transforming it, enriching it, creating new information from it.
Back in the days when we didn't have computers, we did have processes. And we had technologies to move information from one person to another: by courier, Pony Express, tube systems, etc. Then, with the advent of machines and later computers, we started to make some steps in the process more efficient by automating them. Most of the processes stayed, in essence, the same, but we made them faster so we could produce more of the same. Make more profit and keep shareholders happier. But since the processes are still the same, we can replace the automated steps with the original manual steps again, without compromising our business other than not being able to produce as much. So IT is irrelevant for the operation of the business, unless you want to make money that is.

But here's the deal: when it comes to the operation of a business, the continuation of the business, the success of the business, IT is only a tool. It is the constellation of processes that defines the business, that ensures business operation, allows for continuity and determines the success of the business. Probably IT has a strong impact on the success of the business, but unless the process is clearly defined, the cost of performing each step in the process is known (to a degree) and we know what an automated variant of that step would cost, there's no reason to automate. In developing countries, the emerging markets, manual labour is relatively cheap. This is why you'll see 10 men digging a canal instead of one man using a bulldozer: because it's cheaper, more cost efficient. Even in the western world, where manual labour is relatively expensive, automated solutions are often more expensive. Work is automated or machine-aided because time is a constraint, scalability is an issue, or the labour is too hard and machines are more reliable. But automation is, and should always be, a conscious decision.

With today's state of technology, we can automate pretty much everything, and more importantly we can automate the governance of our business processes. This means that we can control the very essence of our enterprise, the process, and keep tabs on what is happening, when, where, by whom and why, all in real time. Allowing us to make split-second decisions when they are needed, but more importantly ensuring that the right information will arrive at the right time with the right person or system. Alerting us when this is not the case, telling us why it is not the case, keeping track of how often this is not the case. We call this Business Process Orchestration, and in my humble opinion it is going to be the area in IT that will garner the biggest crowd.

An important aspect of Business Process Orchestration (BPO) is that it allows us to unambiguously define business processes in a way that is almost technology agnostic, meaning that we can define the process without too much information as to how the steps in the process are implemented. Of course, to make the process trackable and traceable we need to know the implementation, and in a true BPO-governed environment we have to go to that level of detail. But think about this: when we define our process at a high level, such that it is almost irrelevant how the steps in the process are executed, we can use this definition in various contexts.
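A toy sketch in Python of what such a technology-agnostic definition could look like (all step names hypothetical): the process is just an ordered list of steps, and only the bindings decide how each step is implemented.

from typing import Callable, Dict, List

invoice_process: List[str] = [
    "receive_order",
    "check_credit",
    "ship_goods",
    "send_invoice",
]

def run(process: List[str],
        bindings: Dict[str, Callable[[dict], dict]],
        case: dict) -> dict:
    for step in process:
        # An orchestration engine would log who/what/when/why here.
        case = bindings[step](case)
    return case

# Bind each abstract step to a concrete (here: trivial) implementation.
bindings = {step: (lambda c, s=step: {**c, s: "done"})
            for step in invoice_process}
print(run(invoice_process, bindings, {"order": 42}))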

Which is important why?
Remember I stated that the essence of an enterprise is its processes. This means that in order to make the enterprise more profitable, it is required to make the processes more efficient. And in that same train of thought, when the health of the enterprise is in jeopardy, it is important to make the failing processes healthy again. This can only be done when you control the processes and are able to manage them. This is exactly what BPO intends to do for us.
Now think of a scenario in which the health of the enterprise is pretty awesome, but all of a sudden the enterprise finds itself close to dying. A serious disaster has struck the enterprise, for example an earthquake hitting the data center, a tsunami taking its toll, or a revolution overthrowing the stability of the country. In this case you're faced with a situation in which the business continuity of the enterprise is at risk. You're required to ensure that those parts of the business that are crucial for its continuity are taken care of and running as always. Which from an IT perspective means those systems that are crucial need to be up and running, most likely at an alternative site. And here's the multi-million dollar question, literally: what are the crucial systems?
In the old days this was simple: there was only one system, the mainframe, and you needed 2 of them to be ready for whatever disaster. Nowadays it is not that simple. Systems depend on each other, have different designations and different technologies. More users, and a variety of uses as well. So how do you determine what's crucial and what's not? Business processes to the rescue.
Look at the processes that are crucial; the systems used in those are crucial. Look at the costs of making these redundant and see if the 'manual' option is viable as well. For those systems where there cannot be a manual alternative, you spend the dollars to make them disaster-recoverable. While doing so, don't forget to keep in mind the RTO (Recovery Time Objective) and RPO (Recovery Point Objective) of these systems, and obviously also the Confidentiality, Integrity and Availability (CIA) rating of the required information.
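In pseudo-spreadsheet form, the reasoning boils down to something like this (a Python sketch with made-up data):

# A system is a disaster-recovery candidate only when a crucial process
# uses it and no viable manual alternative exists.
processes = {
    "payments":  {"crucial": True,  "systems": ["core-banking", "crm"]},
    "marketing": {"crucial": False, "systems": ["campaign-tool"]},
}
manual_alternative = {"core-banking": False, "crm": True, "campaign-tool": True}

dr_candidates = {
    system
    for process in processes.values() if process["crucial"]
    for system in process["systems"]
    if not manual_alternative[system]
}
print(dr_candidates)  # -> {'core-banking'}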

As you should understand by now, the business process is key in all of this. And there is no point in spending time and money on IT without knowing what processes are being served and how crucial they are.

As always, I'm more than happy to read your comments and learn from them. I'm happy to give feedback on any of your questions, and maybe others will answer them as well.

Iwan

November 10, 2011

The Relevance of Data Governance; Why the 'I' is more Important than the 'T'

It is always interesting to see that in most enterprises the T in IT is considered most important, or at least it gets the better part of the IT department's budget. This is probably due to the fact that the T is always changing. In fact, in order to keep their competitive edge, enterprises need to adopt new technologies, they need to embrace new technologies. Competitive edge becomes cutting edge becomes bleeding edge.
In addition, enterprises need to keep investing in the T because their T vendors have clearly defined end-of-life dates, and when the T is not supported, the enterprise is risking its continuity.
The T is a moving target and requires a constant stream of budget in order for the enterprise to keep up with the competition. In order to control that stream of budget and be able to manage the changes in the enterprise due to new and improved T's, we define projects to migrate from one T to another, to stay up to date with our T and to assess our maturity with respect to the T.



I think most people understand the T better than the I as well.

In many enterprises the T is treated as an asset; it is considered business capital, whereas it is only a tool. But we forget about this. Because it is so dynamic, so hard to capture, to restrain and to harness, it keeps us preoccupied and we forget about the I in IT. Whereas the I is the asset. Without the I there would be no IT.
More interestingly, the I should typically not change at all; there is no end-of-life for the I. Where the T will have to change constantly, the I cannot be allowed to change. Once it is fixed and considered correct, we need to keep it fixed in order to keep it correct. And where we do whatever we can to change the T, where we spend as much as possible to change the T to keep up with the competition, we spend close to nothing to maintain the I, to keep the I stable, fixed and correct. All the new T we introduce in our enterprise directly or indirectly harms the integrity of the I, yet we fail to prevent the T from impacting the I. The T is no longer there to improve the I, to enrich the I, to make the I the added value of the enterprise. In many enterprises the T has transcended from the means to the end.
We have become so dependent on the T that every change in the T is controlled; the T is governed by processes, forms, people and standards, in varying orders of importance depending on the maturity of the enterprise.

It is ironic that the T was once introduced for the I to benefit from. Enterprises were all about the I in those days, but the T took over; its dynamic nature made it more interesting, less boring than the I. Its dynamic nature made us govern its changes. It became our object of expenditure. The I was forgotten, a second-class citizen in the enterprise. But it is the dynamic nature of the T that makes it less of an asset in the enterprise; every investment in the T is by its very nature not a long-term investment. All of a sudden a short ROI is important for every change in the T, because it is not an asset. The T is an opportunity, one that depreciates more and more rapidly as we become more and more dependent on the T to keep the competitive edge.
The I is hardly changing at all. It is growing in size and forms, but once it's here, it's here to stay. The I never changes; its location may change, but the I itself doesn't change. The I therefore is, and must be considered, an asset. The I is long-term planning, and as with all long-term planning, in order to make sense, in order to be fruitful, it needs to be well thought through, to be looked at from every angle, and it needs to be handled with great care. It needs to be governed.

Back to reality.

For most of my professional life I've been contracted or employed by financial institutions, and I think that over the last 2000 years the way banks operate hasn't drastically changed. The advancements in technology have made banking more efficient and have provided us with more and more ways to get in touch with customers, but the essence of banking hasn't changed. People trust banks with their money, and banks in turn entrust others with this money. Banks have invented interest and interest rates in order to implement a viable business.
When one looks closely at the different banks all over the globe, they mainly differentiate themselves by the level of service offered to their customers and, where relevant, by the reputation of the bank in the market it operates in, be it a geographical market or a business market.
The advancements in technology in the last decade have turned financial systems into commodities, where software companies can develop generic solutions for financial markets in a far more cost-effective way than these financial institutions can. Due to this commoditization of technology in the financial sector, it becomes more and more difficult for enterprises in this market to differentiate by level of service without exclusivity. Nowadays every bank has an online presence, and the maturity of the bank defines the diversity of that presence (internet, phone, mobile, smart-phone apps, etc). Because of this, the technology they have at their disposal no longer provides a means to differentiate them from their competition. It can no longer provide them with a competitive edge; it can merely streamline their processes and polish their image.
These days they have to differentiate themselves from competitors by the amount of information they have about their customers, in the context of the customer's world. The customer wants to know his complete financial situation. He wants to know his exact position, but also the bandwidth of his credit. And he also wants to know this in the context of his current situation. For example, when he is abroad and needs to pay a hotel bill, he will not be interested in the amount of money in his savings account, but he will be interested in the limits of his credit cards. But when this same customer is back from vacation, he will be less interested in his credit limits and more interested in his expenses for each of his methods of payment. He will be interested in what is paid by which credit card, what is paid for by debit card, when and where he uses an ATM and how much he took from the ATM. He will want to know what bills are being paid and how much each bill was. What are recurring payments, and is there a trend in the amounts of these payments?
It is not technology that he wants from his bank, but information. People traditionally trust banks with their money, and in an ever more digitized world they trust banks with the data about their financial situation. It is up to banks to turn this data into relevant information for their customers. In order to do this efficiently, and to provide services to their customers efficiently, they need technology. The competitive edge lies in the fact that a bank can turn this data into information for their customers that is relevant in different situations.
Meanwhile, the same data is used to streamline the internal processes and allow for better analysis of the financial markets and the general business a bank is in. Customer data is the micro-level information that is important in understanding macro-level economies, as long as it is in abundance. After all, the financial business is more than anything else a business of people and their trust. And accurate information is the cornerstone of this trust.