Data is changing the paradigm in everything from business to social interactions. Here is what will shape the data landscape for the years to come. By George Anadiotis for Big on Data
The original roaring 20s were 100 years ago, but we may be about to see a new version of this... And if it's going to be "a period of economic prosperity with a distinctive cultural edge," then both the economic and the cultural aspects will be all about data.
Data is shaping a new culture, bringing about a new way of doing business, a new way of decision making, new applications and infrastructure, and is an enabler for the transition to AI. Data is the focal point of our coverage on Big on Data, so following up on Andrew Brust and Tony Baer's predictions, here's our own round of things to keep an eye on in the 2020s.
This is the first part -- including trends 5, 4, and 3. The second part -- including trends 2, 1, and a bonus -- will follow in a few days.
5. It's the end of blockchain as we know it, and I feel fine
Immutability, meaning written data is tamper-resistant "forever," and the ability to create and transfer assets on the network without reliance on a central entity, based on a decentralized consensus mechanism: this is what powered Bitcoin, which opened up the realm of possibilities, but also spurred an immense wave of speculation, ignorance, and fraud.
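The immutability property comes from hash-linking: each block embeds the hash of its predecessor, so altering any past record invalidates every later link. A minimal sketch in Python -- illustrative only, nothing like a production blockchain (no consensus, no signatures):

```python
import hashlib
import json

def block_hash(block: dict) -> str:
    """Deterministic SHA-256 hash of a block's contents."""
    payload = json.dumps(block, sort_keys=True).encode()
    return hashlib.sha256(payload).hexdigest()

def append_block(chain: list, data: str) -> None:
    """Link a new block to the hash of the previous one."""
    prev = block_hash(chain[-1]) if chain else "0" * 64
    chain.append({"prev_hash": prev, "data": data})

def verify(chain: list) -> bool:
    """Recompute every link; tampering with any earlier block breaks the chain."""
    for i in range(1, len(chain)):
        if chain[i]["prev_hash"] != block_hash(chain[i - 1]):
            return False
    return True

chain = []
append_block(chain, "alice pays bob 5")
append_block(chain, "bob pays carol 2")
assert verify(chain)

chain[0]["data"] = "alice pays bob 500"  # tamper with history
assert not verify(chain)
```

In a real network, decentralized consensus adds the second half of the story: no single party can rewrite the chain, because the other participants hold and verify their own copies.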
Today, blockchain seems to have hit rock bottom. Gartner is placing blockchain at the bottom of the Trough of Disillusionment. Scams such as OneCoin and Bitfinex-Tether exemplify the Wild West that blockchain has become, and Facebook's Libra is not getting traction either -- which is a good thing.
Blockchain will have a transformational impact across industries in five to 10 years.
Even leaving speculators out of the picture, there are still technical issues to resolve. A database with poor performance, an inability to interface with the outside world, and no query language to speak of is not exactly a solid substrate for applications. This is where blockchain is today. But here is the good news: yes, blockchain will have a transformational impact across industries in five to 10 years.
Although enterprise blockchain implementations do exist -- check JP Morgan's Quorum or Oracle's Blockchain Platform -- the blockchain landscape in the 2020s will probably be very different. It may not even go by the same name: Distributed Ledger Technology is a better match for a technology that may not really be a blockchain anymore. DLTs will be front and center in the 2020s.
4. On cloud No. 9... or, how many clouds, actually?
This one should be rather familiar. The gradual move of applications and data to the cloud has been a recurring theme on Big on Data. A decade ago, for most businesses, the cloud was not really an option for data and compute at scale. Hadoop was all the rage -- a synonym for Big Data, for all intents and purposes. Today's world is very different.
In addition, cloud storage, which is where it all started, is becoming more sophisticated, too. Amazon Athena has added SQL query capabilities on top of AWS S3. The open-source Delta Lake project unifies cloud storage and data warehouses. So cloud storage looks more like a database, and cloud providers offer data management solutions, too.
Cloud adoption is growing.
(Image: maxsattana, Getty Images/iStockphoto)
But mobility goes both ways. Every database today has a managed offering in the cloud or is in the process of getting one. With databases getting increasingly complex to manage, and the ability to get on-demand storage and compute for their needs offering major advantages, managed databases in the cloud are a natural evolution.
Today, when choosing a data management solution, the shortlist almost always includes an offering by a cloud vendor. Being cloud-native, managed, and billed via a single control plane, and having multi-region availability makes cloud vendor offerings attractive. On the other hand, those offerings are not multi-cloud and are not always best of breed.
Meanwhile, open source is winning, in databases and beyond. There are some very good reasons why this is happening: low barrier to entry, community, innovation, interoperability. But the fact that open source is becoming the norm in enterprise software has side effects, too. To put it simply: AWS is eating open-source software because it can.
Monetizing open source is complicated, but getting a viable open source business model in place is vital for open source sustainability.
(Image: 451 Group)
This is really about much more than databases and vendors. What this really is about is business models around shared resources and fair contribution and reward around those shared resources. Open-source software is free as in speech, but not free as in beer. Someone has to build the software, and then someone has to maintain, run, and manage it.
So, it all comes down to how much each actor gives and takes, and whether this should somehow be accounted for. Clearly, this is a much broader topic than what we could possibly address here, so a piece crystallizing thoughts and exchanges on the topic is due. In the meantime, let us be reminded of another example of a "free" shared resource -- the web.
Much of the trouble with the web and the monopolies built around it today stems from the failure to operate a workable business model around it. In its absence, the Googles and Facebooks of the world have been eager to step up and fill that void, dominating the web and building ad-based empires in the process. Making the same mistake twice would not be wise, and this deserves to be a top priority for the 2020s.
According to the latest research from IDG, 90% of organizations expect to have workloads deployed in the cloud by the end of 2019. Of those, 42% anticipate a multi-cloud environment comprising a combination of public, private, and private/hosted clouds with a provider.
Statistics like these suggest we’re way past the tipping point for cloud computing and fast-approaching an era in which Infrastructure as a Service (IaaS) becomes the norm. But if you’re still not sure the cloud is right for your workloads, the following nine reasons our customers choose cloud computing may help. We’ve even thrown in a few tips and resources that can help you maximize your return.
Why Businesses Choose Cloud Computing
1. Cost Effectiveness – Virtualized resources remove the capital expense of procuring and maintaining equipment as well as the expense of maintaining an on-premises data center: cooling, physical security, janitorial services, etc.
In addition, cloud service providers deliver economies of scale and expertise for a faster return on investment (ROI). For example, did you know that data centers already account for more than 2% of the electricity consumed in the US? All of that power requires significant cooling. Larger data center operators have the bandwidth to invest in high-efficiency cooling systems that reduce power consumption and costs.
2. Speed to Market – These days, almost every organization is doing some sort of software development designed to enhance market position. Cloud computing allows the organization to quickly provision resources for development and testing across a number of different types of environments.
Once these applications are ready for roll out, developers can quickly transition code to a live production environment for a smoother product launch. Because these environments are highly scalable (see Reason #3), the organization doesn’t need to worry about inaccurately estimating capacity requirements.
Working with a managed service provider like TierPoint to manage your live environment can also help you maximize uptime and performance for greater customer satisfaction. And while we’re focusing on your infrastructure, you can be focusing on creating “killer apps” for your customers.
3. Scalability – Estimating data center capacity requirements is one of the most difficult tasks an IT leader faces. Over-estimate and you end up sinking capital into capacity you don’t need. Underestimate and you end up crippling the business’s ability to respond to opportunity.
Cloud computing resources (compute, cloud storage, and network bandwidth) can be scaled up, down, or off to meet your current needs. Public clouds like AWS and Azure even provide for elastic computing, which automatically expands compute and storage to fit unanticipated capacity requirements.
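Elastic computing of this kind boils down to a control loop: measure utilization, compare it against thresholds, and adjust capacity. A simplified sketch of the decision logic -- the thresholds and instance limits here are illustrative assumptions, not any provider's actual policy:

```python
def scale_decision(cpu_utilization: float, current_instances: int,
                   min_instances: int = 1, max_instances: int = 10) -> int:
    """Return the new instance count for a simple threshold-based autoscaler.

    Scale out above 75% average CPU, scale in below 25%; otherwise hold steady.
    """
    if cpu_utilization > 0.75 and current_instances < max_instances:
        return current_instances + 1
    if cpu_utilization < 0.25 and current_instances > min_instances:
        return current_instances - 1
    return current_instances

# A traffic spike grows the pool; quiet hours shrink it again.
assert scale_decision(0.90, 3) == 4
assert scale_decision(0.10, 3) == 2
assert scale_decision(0.50, 3) == 3
```

Real autoscalers (AWS Auto Scaling, Azure VM Scale Sets) layer cooldown periods and target-tracking on top of this basic loop, but the principle -- capacity follows demand, within bounds -- is the same.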
4. Increased Productivity – When your team doesn’t need to spend time maintaining equipment, they can focus on higher-value activities like IT security and data analytics. Combine cloud computing with managed services, and you can get even more done by offloading tasks for which your team isn’t suited or for which you don’t have the bandwidth.
5. Innovation – Have a project on your to-do list that you never seem to get to because you’re too busy managing your infrastructure and fighting fires? By offloading infrastructure management and other time-consuming tasks to a managed service provider like TierPoint, your team can focus on the innovations that drive the business forward.
6. Improved Performance – High-performance computing, fast communication networks, and local edge computing are among the many ways cloud computing reduces latency and improves performance.
In addition, data center equipment can quickly become obsolete, but many organizations try to get every last dollar they can out of their investment before they upgrade. Third-party data center providers like TierPoint typically follow a hardware refresh cycle that is far shorter than that practiced by on-premises data center operators. Newer hardware frequently means higher performance.
7. Data Security – Third-party cloud providers like Amazon, Microsoft, and TierPoint have the bandwidth to focus on IT security 24/7/365. We’ve got our eye on emerging threats and the technologies used to combat them. We also have the resources to hire high-demand cloud security professionals and the expertise to implement best-practice controls and policies for cloud security. Most on-premises data center operators can’t afford to expend this kind of effort on their in-house IT security efforts. Read our Strategic Guide to IT Security to learn more about cybersecurity and IT security fundamentals.
8. Availability – If you’re managing an on-premises data center, chances are you vividly remember the last time you experienced planned or unplanned downtime. It’s that painful. Well-established third-party cloud providers have the resources to invest in redundant infrastructure, UPS systems, environmental controls, network carriers, power sources, etc., to ensure maximum uptime.
9. Resiliency – Before cloud computing, enterprises needed to invest in redundant infrastructure to protect themselves in the event of a disaster. Cloud computing makes disaster recovery far more cost-efficient and effective by enabling replication and failover to an alternate location in the cloud.
Cloud computing also gives you the flexibility to choose the failover location and model that optimizes RTO (recovery time objectives), RPO (recovery point objectives), and cost for each workload. For example, many organizations are incorporating public clouds, like AWS and Azure, as a component of their disaster recovery plans for faster, more cost-effective recovery solutions.
Unfortunately, too many organizations don’t have the time, expertise, or bandwidth to do adequate disaster recovery planning. A Disaster Recovery as a Service (DRaaS) solution can help ensure you have the bases covered.
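Choosing a failover model per workload is essentially a constraint check: given each workload's tolerable downtime (RTO) and tolerable data loss (RPO), pick the cheapest DR tier that satisfies both. A hedged sketch -- the tier names, numbers, and costs below are made up for illustration and vary widely by provider and architecture:

```python
# Hypothetical DR tiers: (name, achievable RTO hours, achievable RPO hours,
# relative cost), ordered cheapest first. Real figures are provider-specific.
DR_TIERS = [
    ("backup-and-restore", 24.0, 24.0, 1),
    ("pilot-light",         4.0,  1.0, 3),
    ("warm-standby",        1.0,  0.25, 6),
    ("hot-standby",         0.1,  0.0, 10),
]

def cheapest_tier(rto_hours: float, rpo_hours: float) -> str:
    """Pick the lowest-cost tier whose achievable RTO and RPO meet the objectives."""
    for name, rto, rpo, _cost in DR_TIERS:
        if rto <= rto_hours and rpo <= rpo_hours:
            return name
    raise ValueError("no tier meets these objectives")

assert cheapest_tier(48, 48) == "backup-and-restore"   # lax objectives, cheap tier
assert cheapest_tier(2, 0.5) == "warm-standby"          # tight objectives cost more
```

The point of the exercise is that not every workload deserves hot standby: mapping objectives to tiers workload by workload is where the cost efficiency of cloud-based DR comes from.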
Thinking of Migrating Your Workloads to the Cloud?
The benefits of cloud computing are many but maximizing the return on your investment takes careful planning. A well-thought-out cloud migration plan can minimize downtime, ensure the safety of your data in transit, and help you maintain compliance and security throughout the process.
Analysts Will Discuss Cloud Security at the Gartner Security & Risk Management Summits in London and Dubai
Rapid growth in cloud adoption is driving increased interest in securing data, applications and workloads that now exist in a cloud computing environment. The Gartner, Inc. Hype Cycle for Cloud Security helps security professionals understand which technologies are ready for mainstream use, and which are still years away from productive deployments for most organizations (see Figure 1).
"Security continues to be the most commonly cited reason for avoiding the use of public cloud," said Jay Heiser, research vice president at Gartner. "Yet paradoxically, the organizations already using the public cloud consider security to be one of the primary benefits."
The attack resistance of the majority of cloud service providers has not proven to be a major weakness so far, but customers of these services may not know how to use them securely. "The Hype Cycle can help cybersecurity professionals identify the most important new mechanisms to help their organizations make controlled, compliant and economical use of the public cloud," added Mr. Heiser.
Figure 1. Hype Cycle for Cloud Security, 2017
Source: Gartner (September 2017)
At the Peak
The peak of inflated expectations is a phase of overenthusiasm and unrealistic projections, where the hype is not matched by successful deployments in mainstream use. This year the technologies at the peak include data loss protection for mobile devices, key management as-a-service and software-defined perimeter. Gartner expects all of these technologies will take at least five years to reach productive mainstream adoption.
In the Trough
When a technology does not live up to the hype of the peak of inflated expectations, it becomes unfashionable and moves along the cycle to the trough of disillusionment. There are two technologies in this section that Gartner expects to achieve mainstream adoption in the next two years:
Disaster recovery as a service (DRaaS) is in the early stages of maturity, with around 20 to 50 percent market penetration. Early adopters are typically smaller organizations with fewer than 100 employees that lacked a recovery data center, experienced IT staff and the specialized skills needed to manage a DR program on their own.
Private cloud computing is used when organizations want the benefits of public cloud — such as IT agility to drive business value and growth — but aren’t able to find cloud services that meet their needs in terms of regulatory requirements, functionality or intellectual property protection. The use of third-party specialists for building private clouds is growing rapidly because the cost and complexity of building a true private cloud can be high.
On the Slope
The slope of enlightenment is where experimentation and hard work with new technologies are beginning to pay off in an increasingly diverse range of organizations. There are currently two technologies on the slope that Gartner expects to fully mature within the next two years:
Data loss protection (DLP) is perceived as an effective way to prevent accidental disclosure of regulated information and intellectual property. In practice, it has proved more useful in helping identify undocumented or broken business processes that lead to accidental data disclosures, and providing education on policies and procedures. Organizations with realistic expectations find this technology significantly reduces unintentional leakage of sensitive data. It is relatively easy, however, for a determined insider or motivated outsider to circumvent.
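At its core, a DLP scanner pattern-matches outbound content against signatures of regulated data. A toy sketch of that idea -- the patterns below are simplified assumptions; production DLP systems add validation checksums, document fingerprinting, and contextual analysis:

```python
import re

# Simplified patterns for two common categories of regulated data.
PATTERNS = {
    "ssn": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "credit_card": re.compile(r"\b(?:\d{4}[ -]?){3}\d{4}\b"),
}

def scan(text: str) -> list:
    """Return the categories of sensitive data detected in the text."""
    return [name for name, pattern in PATTERNS.items() if pattern.search(text)]

assert scan("Invoice for order #1234") == []
assert scan("SSN on file: 123-45-6789") == ["ssn"]
assert scan("card 4111 1111 1111 1111") == ["credit_card"]
```

This also illustrates why DLP is easy to circumvent: a determined insider who reformats or encodes the data slips past naive pattern matching, which is exactly the limitation noted above.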
Infrastructure as a service (IaaS) container encryption is a way for organizations to protect their data held with cloud providers. It’s a similar approach to encrypting a hard drive on a laptop, but it is applied to the data from an entire process or application held in the cloud. This is likely to become an expected feature offered by a cloud provider, and indeed Amazon already provides its own free offering, while Microsoft supports the free BitLocker and dm-crypt tools for Linux.
Reached the Plateau
Four technologies have reached the plateau of productivity, meaning the real-world benefits of the technology have been demonstrated and accepted. Tokenization, high-assurance hypervisors and application security as a service have all moved up to the plateau, joining identity-proofing services, which was the only entrant remaining from last year’s plateau.
"Understanding the relative maturity and effectiveness of new cloud security technologies and services will help security professionals reorient their role toward business enablement," said Mr. Heiser. "This means helping an organization’s IT users to procure, access and manage cloud services for their own needs in a secure and efficient way."
Gartner clients can read full analysis on the technologies in this Hype Cycle in "Hype Cycle for Cloud Security, 2017." This research is part of the Gartner Trend Insight Report "2017 Hype Cycles Highlight Enterprise and Ecosystem Digital Disruptions." With over 1,800 profiles of technologies, services and disciplines spanning over 100 Hype Cycles focused on a diversity of regions, industries and roles, this Trend Insight Report is designed to help CIOs and IT leaders respond to the opportunities and threats affecting their businesses, take the lead in technology-enabled business innovations and help their organizations define an effective digital business strategy.
Gartner analysts will provide additional analysis on IT security trends at the Gartner Security & Risk Management Summits 2017, taking place in London and Dubai. Follow news and updates from the events on Twitter at #GartnerSEC.