End of year blog: Reload before the Revolution begins
The tagline of the Wachowskis’ second Matrix film might be an apt description of 2018 in IT terms. Trends that were hyped and ‘expected to boom’ this year proved relatively restrained in the eyes of the average organization. But if that is the case, why? Has there been no significant development at all? Time to take a comprehensive look at the ‘Year of the Dog’ in retrospect.
As we approach the end of 2018, renowned industry researchers such as Gartner, IDC and Forrester present their expectations for the new year, with many projecting big leaps for IoT, Edge Computing, AI and Blockchain. In this end-of-year blog, we will discuss some of these exciting developments from a more pragmatic point of view: how they might impact us as consultants and our customers in the near future. I (LvdG) will be joined by managing consultant Dirk de Vries (DV), who will elaborate on the concept of AI as well as SAP’s ambitions with the technology, and data architect Rogier van Royen (RvR), who will share his knowledge on the widespread adoption of new technologies and explain why this process takes more time than enthusiasts initially anticipated. Let us dive into it!
Preamble: why 2018 was a restrained year for IT trends
RvR: When predicting change, people tend to overestimate what happens in one year, but underestimate what will happen over a ten-year period. It is much like turning an aircraft carrier: initially it is often slower than you would think, but once it picks up the pace, it suddenly turns very fast. Innovation and change are used to tackle problems one has (or believes one has). The explicit starting point is one’s own mental framework. It takes a new generation to look at the same problem agnostically and find solutions off the beaten path. To illustrate, a nice practical example:
The modern car was invented around 1885. The first models looked a lot like a chariot equipped with a (small) engine. By 1910, cars had evolved a lot, and did not look like a chariot anymore… However, the driver was still sitting outside of the cabin. The driver had always been sitting outside to be able to yell at the horses. This was a ‘solved problem’ so no one thought of changing this design pattern for 25 years. It takes a generational shift to really think out of the box. Experience comes with blinkers that focus but limit one’s sight.
In the same way, companies tend to use IT to solve the problems they encounter or to optimize their processes. Sometimes, however, these processes or problems would not exist at all if the company had been set up and organized with the available technologies in mind. Large organizations spend a lot of energy optimizing back-office processes, yet many of those processes would be obsolete if the company had been designed with IT solutions in mind. Most start-ups, for example, do not have a traditional back office.
LvdG: Also, one should remember that people (and thus organizations) are naturally risk-averse and resistant to change. This rigidity is difficult to overcome and can stand in the way of adopting new technology, often without sound arguments. Again, companies without a (long) history, such as start-ups, have little or none of this rigidity, which makes them better suited to accommodate new ideas and practices. In today’s fast-paced economy, this is more relevant than ever.
Now that we have discussed some reasons why this year has been an evolution rather than a revolution, let’s take a closer look at the actual technologies that were subject to these forces in 2018.
Blockchain
LvdG: Despite the deflating cryptocurrency bubble, Blockchain gained understanding across a select variety of potential users but is still perceived as too complex for general adoption. While tech-savvy companies and determined start-ups are publishing the first Blockchain-based applications (or proofs of concept), the technology remains somewhat vague to the mainstream. That perception prevented it from shaking off its initial crypto image, even though Blockchain is far more than just a digital currency market; it was simply not always interpreted that way, which is why short-term expectations were completely out of proportion.
RvR: Technologies like Blockchain and Artificial Intelligence have too often been used to dress up PowerPoint decks, with insufficient focus on the value behind them. But we’ve clearly reached a tipping point. This year was indeed a lot quieter than most analysts expected, but I do believe the development of these technologies will gain momentum over the next few years.
LvdG: Indeed, I also expect 2019 to bring about a change in this paradigm, with smart contracts emerging as the heralds of the decentralized economy and security tokens rising from the ashes of the crypto hype. Although the latter in particular will require the parallel development of a supporting infrastructure (i.e. a platform), Blockchain appears to have passed its introductory phase.
Artificial Intelligence
LvdG: Another important technology that has been in the spotlight for some time now is Artificial Intelligence (AI). Generally, AI was not able to live up to the admittedly high expectations this year, as companies across the globe struggled with the effective implementation of chatbots and smart robotics. Furthermore, setbacks with self-driving cars have put the technology in a rather unfortunate spotlight with the general public.
RvR: Still, large amounts of data are increasingly available. It is ever more difficult to combine this data for traditional human analysis. Mathematical models and new forms of data representation are used to transform data into usable information.
LvdG: While that is true, projections by Forbes and IDC state that AI adoption within governments is still more extensive than within private companies, and that this trend will continue in 2019. With society still rather cautious about adopting AI for day-to-day operations, the new year is set to continue the current trend of (incremental) evolution; although that is not to say that AI is any less important a development. This is especially evident in the way software vendors such as SAP aim to deliver AI applications to their customers.
DV: AI is becoming part of the standard way of working and will eliminate tedious activities for the workforce. The “system” will do everything that can be done automatically, and the human will focus on his or her strengths: being creative and handling exceptions. As a leading vendor of Intelligent Enterprise software, SAP has built many standard processes that make use of AI. These processes make decisions based on past decisions made by humans, and give advice in cases where the validity score is not high enough to make a definite decision. Examples of this kind of operational AI include invoice matching and automated procurement. By utilizing algorithms and neural networks, systems can do more (being available 24/7) and better work than humans, because they draw on the knowledge of humans, which is stored in the database as historical transactions.
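The validity-score pattern Dirk describes can be sketched in a few lines. This is a minimal illustration, not SAP’s actual implementation; the threshold value and function names are hypothetical:

```python
# Sketch of score-gated automation: decisions above a confidence
# threshold are taken automatically, the rest are routed to a human,
# mirroring the invoice-matching pattern described above.

AUTO_THRESHOLD = 0.95  # hypothetical cut-off

def route_decision(validity_score: float, proposed_action: str) -> str:
    """Auto-approve confident matches, escalate uncertain ones."""
    if validity_score >= AUTO_THRESHOLD:
        return f"auto: {proposed_action}"
    return f"review: {proposed_action} (score {validity_score:.2f})"

print(route_decision(0.98, "match invoice 4711 to PO 0815"))
print(route_decision(0.60, "match invoice 4712 to PO 0816"))
```

The key design point is that the system never silently guesses: below the threshold, the human stays in the loop, and the human’s choice becomes a new historical transaction the model can learn from.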
RvR: The next step is to have these algorithms propose new combinations and insights. Neural networks and genetic algorithms make it possible to have the computer look for solutions without teaching it how; instead, you only tell it when something is a success or a failure. Neural networks are built from a very large number of functions, each accepting a massive set of input parameters. Each function generates a simple answer, and the combined footprint of all these answers determines the outcome of the algorithm. Neural networks are used, for example, for image recognition and pattern detection. Feedback loops enable the algorithms to learn and improve over time. Large tech companies (like Google and Facebook) have used these algorithms for years, and their libraries are freely available. The adoption of these algorithms in mainstream solutions is steadily growing. With the rise of quantum computing and the integration of hardware acceleration (using FPGAs or dedicated ASICs), these capabilities will grow exponentially in the coming years.
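The success/failure feedback loop Rogier describes can be shown at its smallest scale: a single artificial neuron (a perceptron) that learns the logical OR function without ever being told the rule, only whether each answer was right or wrong. This is a toy sketch for illustration, not a production neural network:

```python
# A single neuron learns logical OR purely from success/failure
# feedback: weights are nudged only when a prediction is wrong.

def step(x: float) -> int:
    """Activation: fire (1) if the weighted sum is positive."""
    return 1 if x > 0 else 0

def train_perceptron(samples, epochs: int = 10, lr: float = 0.1):
    """Adjust weights on each failure signal (prediction error)."""
    w = [0.0, 0.0]
    b = 0.0
    for _ in range(epochs):
        for (x1, x2), target in samples:
            pred = step(w[0] * x1 + w[1] * x2 + b)
            error = target - pred  # 0 on success, +/-1 on failure
            w[0] += lr * error * x1
            w[1] += lr * error * x2
            b += lr * error
    return w, b

# Logical OR as training data: the rule is never stated explicitly.
data = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 1)]
w, b = train_perceptron(data)
for (x1, x2), target in data:
    assert step(w[0] * x1 + w[1] * x2 + b) == target
```

Real networks stack millions of such units and replace the crude error nudge with gradient descent, but the principle is the same one described above: the footprint of many simple answers, corrected by feedback, determines the outcome.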
LvdG: The fact that these algorithmic advancements will allow systems to take over mundane activities from humans is especially true, and not all that far in the future. SAP predicts that 60 percent of human tasks will be automated by 2025. That might sound far away, but try to picture over half the tasks you see people do in everyday life being done by a computer in less than seven years; it’s stunning! Our working society is already changing to accommodate these AI applications, and SAP projects that ‘Intelligent Enterprises’ will play a big role in this.
DV: The Intelligent Enterprise as described by SAP features three base components: the Intelligent Suite, Intelligent Technologies and the Digital Platform:
- The Intelligent Suite helps you manage customers, supply chain networks, employees and core processes. These functionalities are available in SAP’s products such as S/4HANA, C/4HANA and SAP Ariba.
- The Intelligent Technologies drive innovation; they can give a company the required competitive power or form the basis of a disruption. Examples are Artificial Intelligence, Blockchain, data integration, big data, IoT and analytics.
- The Digital Platform lets you manage data from any source in any format and rapidly develop, integrate and extend business applications. It consists of two core technologies:
- SAP HANA data management suite.
- SAP Cloud platform.
To become an Intelligent Enterprise, one can choose from the following three paths, or combine them if preferred:
- Optimize your processes for more efficiency and reliability.
- Extend your current processes to capture new sources of value.
- Transform your value chain or business model.
Integrating these modern technologies into a process can disrupt a whole market: activities that were previously left unaddressed because of cost inefficiency can now be handled, because a computer can operate them in a cost-effective manner. For companies that outsourced their repetitive tasks to countries with lower labor costs, this can even make backshoring worth considering.
Data storage and consumption
RvR: Data is increasingly important for companies, both internally, to optimize processes, and externally, to react faster to changing situations. This is referred to as the concept of ‘data-driven organizations’. Every company is in some way affected by this evolution.
LvdG: With the ever-increasing amount of data, and both the possibilities and requirements that come with that data, organizations are forced to think about their information governance in different ways. Data warehouses turn into data lakes to accommodate (un)structured data, and lakes in turn transform into reservoirs to allow for more control and transparency. Legislation and advanced analytics almost demand that companies invest not only in efficient storage, such as Solid-State Drive (SSD) technology, but also in reliable backup systems. If anything, 2018 has seen many high-profile data breaches, involving both governments and companies such as Facebook (Cambridge Analytica), that emphasize this development. Hence, data will need to be handled with more care and from different perspectives.
RvR: A new breed of companies has emerged, providing data services and insights. They focus on scraping data from different sources and correlating it into something that is of value to their customers. They therefore act as data hubs and offer analytics and/or streams (push services). This means that traditional companies can keep their eye on the business: they can rely on these data hubs to provide the data they need to act and react in a dynamic marketplace, though they do need to integrate their information systems and decision processes with these hubs. It also means that a differentiating factor between companies (to keep a competitive advantage) lies in the ability to internally combine the information flows of different data hubs. How can information coming from different sources be combined in a coherent and streamlined way? And how can this information be used to develop a successful strategy and adapt it when needed?
Cloud to (the) Edge computing
RvR: The cloud-to-edge computing trend is a natural evolution of what has been happening over the past years. Increased computing power across devices has made it possible to shift business and application logic towards the user, or edge. At first this was used to improve the user experience, but soon JavaScript frameworks and runtimes (like Angular and Node.js) made it possible to shift more and more complex logic. These frameworks quickly evolved to support different front-end devices. Now different front-ends can share the same code, and business logic is moved to the device in a maintainable way. This has greatly improved user interaction, but it does require new app development paradigms and it changes the general architecture.
LvdG: Additionally, the move of big data to the cloud was hampered by the ever-present dilemma of ‘public’, ‘private’ or ‘hybrid’, not to mention the newly enforced GDPR, to which many customer-driven organizations still do not conform. After being beaten to the punch by IBM and Microsoft, Amazon and Google have been gearing up to release their own hybrid-cloud offerings, pulling the concept firmly into the mainstream. This brings us back to the expected advance of edge computing, which, as Rogier already stated, will make companies realize that it is much more efficient to analyse data in a decentralized manner than to bring it all back to a single core.
RvR: In the coming year, this tendency will accelerate. Software suppliers will increasingly need to choose between focusing on providing core functionality with API connectivity, or on user adoption, competing for the user’s attention: application backends focus more on sophisticated ways to provide their core functionalities, exposing them through a set of APIs. Through API management, an application can now be completely decoupled from its user representation. An application can permit different competing front-ends to fully use its functions. This broadens the user base and can enable new business models (e.g. API subscriptions).
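The decoupling Rogier describes can be sketched in miniature: the backend registers its core functions under stable API names, and any number of front-ends consume the same endpoints. The endpoint names and front-ends below are illustrative, not a real product’s API:

```python
# Sketch of decoupling an application from its user representation:
# the backend exposes functions through a small API registry, and
# competing front-ends all consume the same endpoints.

from typing import Callable, Dict

API: Dict[str, Callable] = {}

def endpoint(name: str):
    """Register a backend function under a stable API name."""
    def register(fn: Callable) -> Callable:
        API[name] = fn
        return fn
    return register

@endpoint("orders/total")
def order_total(prices_cents: list) -> int:
    """Core business logic: total an order, in cents."""
    return sum(prices_cents)

# Two competing front-ends reuse the same decoupled core function.
def web_frontend(prices_cents: list) -> str:
    return f"Your total: EUR {API['orders/total'](prices_cents) / 100:.2f}"

def voice_frontend(prices_cents: list) -> str:
    return f"The order comes to {API['orders/total'](prices_cents) / 100:.2f} euros."

print(web_frontend([999, 500]))
print(voice_frontend([999, 500]))
```

In a real system the registry would be an HTTP API behind an API-management layer (handling versioning, authentication and usage-based billing), but the architectural point is the same: neither front-end knows or cares how `orders/total` is implemented.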
LvdG: The million-dollar question here is, quite literally: who is going to pay for it? After all, the central idea of edge computing also requires quite powerful hardware and lots of bandwidth in places where these are not (yet) widely available, such as cars, tablets or smartphones. This is another topic that will become clearer as we stroll into the new year… Questions like these and many more rise on the horizon as we get closer and closer to the next-generation society. Whether we call it smart, connected, distributed, edge-computed or perhaps automated, technologies that have been slowly building below the surface are primed to make their debut. Is 2019 going to be their ‘internet’ year? Only time will tell.