As Cloud Computing becomes increasingly prevalent across industries, a growing number of people find themselves exposed to this set of technologies. For instance, Gartner estimates that by 2024 more than 45% of IT spending on infrastructure will be allocated to the cloud. Moreover, cloud technologies are a particularly good fit for some of the major challenges modern companies face:
As data keeps growing exponentially, the share of structured productivity data in the overall data mix has kept decreasing over the past decades, increasing the need to work with unstructured data and to process data closer to the source
There is value for businesses in investing in IoT (Internet of Things) and capitalizing on the resulting data streams, which emphasizes the need for tools capable of processing large amounts of data in real time or near real time
The Covid-19 crisis sharpened the need for companies to increase their efficiency and cost-effectiveness, which can be achieved by increasing the level of automation and by developing new ways of serving customers
The purpose of this article is to introduce the concepts of Cloud Computing, show some cloud architecture examples, and explain some of the main differences between traditional solutions and cloud-based solutions. Next, three sections will each be dedicated to a typical use case of cloud architecture in Azure:
Cloud architecture example 1: Building an enterprise data hub
Cloud architecture example 2: Self-service reporting
Cloud architecture example 3: Real-time analysis and predictions
Before diving into what cloud architecture is, it is important to first understand what cloud computing is. Cloud computing refers to computing services provisioned by a third-party provider, and accessible over the Internet, i.e. the Cloud. Such services include compute, storage, networking, intelligence, etc.
Moreover, one can distinguish between three types of cloud computing:
Public Cloud – the cloud computing resources are owned by a third-party provider and the underlying hardware is shared among the provider’s customers. In this scenario, the customer’s emphasis is on cost reduction
Private Cloud – the cloud computing resources are owned by a third-party provider and the underlying hardware is exclusively reserved for a single customer. Alternatively, the computing resources are directly owned by the company, but located in a data center outside of the company’s firewall. In this scenario, the customer’s emphasis is on control
Hybrid Cloud – this is a scenario in which the company combines both Private and Public cloud resources.
As for cloud architecture, this concept refers to the integration of cloud computing services into environments of interconnected services that form an online platform on which applications can run. All in all, one could think of this as the process of manufacturing a car, where different pieces (cloud services) are assembled following a blueprint (cloud architecture) to build the car (cloud platform/application). Another aspect of cloud architecture worth mentioning is the service tiers that are available:
Infrastructure as a Service (IaaS) tier – renting IT infrastructure from a provider. In this scenario, the customer is still responsible for setting up the components and managing the environment except that the underlying physical assets are owned by the provider. This service tier is useful when the customer requires very tight control over the resources deployed on the Cloud. Examples of such services on Azure are Virtual Machines, Storage Accounts, Virtual Networks, etc.
Platform as a Service (PaaS) tier – renting an on-demand environment for creating and managing applications. In this scenario, the customer is responsible for managing the platform without taking care of the underlying infrastructure. This service tier is useful for application development. Examples of such services on Azure are Data Factories, SQL Databases, App Services, Machine Learning Services, etc.
Software as a Service (SaaS) tier – renting software applications over the Internet. In this scenario, the provider hosts and manages the software application, the underlying infrastructure, and handles any maintenance whereas the clients access the service on-demand. Examples of such services on Azure are Microsoft Office 365, Power BI, etc.
Lastly, when compared to solutions based on on-premise resources, cloud solutions have several attributes that will have an impact on their design. Here is a summary of the main ones:
Cloud-based architecture abstracts the physical assets layer of any solution – no matter the tier of cloud services being used, the cloud provider will be responsible for provisioning and maintaining the physical assets required for the service to run, i.e. storage, servers, machines, etc. This means that the responsibilities of ICT teams will typically shift, as physical maintenance will no longer be required.
Cloud-based solutions are globally accessible by nature – access to the cloud is done over the Internet, so cloud-native applications can be accessed from anywhere. While this is an advantage for solutions that need to be globally distributed, it also means that developers can no longer assume that access will always happen from on-premise networks. Therefore, security design is extremely important to make sure that only authorized people can access these applications. To achieve this, any cloud architecture should take advantage of the numerous security mechanisms that cloud providers make available.
Cloud-based architecture benefits from a higher level of scalability – solutions designed on the Cloud are not limited to assets owned in on-premise data centers, and cloud providers typically offer a high level of scalability for their services. Therefore, scaling cloud solutions up or out is usually a matter of a few clicks.
To help better understand how all these concepts fit together, in the next sections we will explain some cloud architecture examples.
Cloud architecture example 1: Building an enterprise data hub
The context for this first Azure cloud architecture example is the following: “PolyR is a company specialized in the production of various plastic-based goods. Its headquarters are in Belgium, but its production sites are spread across Europe and Asia. At the moment, the company has all its data stored in an on-premise data center located at its HQ. It is worth noting that the company owns multiple systems for the various departments, which are not well integrated with each other:
The ERP system is backed by a set of on-premise databases
The CRM system was recently migrated to a cloud-based service, so its data is stored on the provider’s side
Each plant stores production data on local databases
During the previous general assembly, it was decided that the company would build a central data hub. The purpose of this data hub is to centralize data assets, increase data source visibility across the company, and facilitate cross-system data combinations.” The requirements defined during PolyR’s general assembly could be implemented with a solution on the Cloud. The diagram below is a high-level design of the architecture to implement.
In this example, the cloud architecture design supports the following features:
Data on Azure is stored as blobs in a Gen2 Data Lake Storage Account. This storage solution is highly available, encrypts data at rest and is scalable, which makes it a perfect candidate for data lakes. In addition, Gen2 Storage Accounts support a hierarchical folder structure, which, combined with POSIX-style access control lists, allows fine-grained access control over the Storage Account. PolyR can therefore use this service to store all its data, both in its raw format and in a more standardized one.
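As an illustration of such a hierarchical layout, here is a minimal sketch of a path convention for the Data Lake. The zone names, source systems and date-partitioned structure below are assumptions for illustration, not part of PolyR’s actual design:

```python
from datetime import date

# Hypothetical folder convention inside one Gen2 container:
# <zone>/<source-system>/<dataset>/<yyyy>/<mm>/<dd>
ZONES = ("raw", "curated")

def lake_path(zone: str, source: str, dataset: str, day: date) -> str:
    """Build the hierarchical blob path for one daily data drop."""
    if zone not in ZONES:
        raise ValueError(f"unknown zone: {zone}")
    return f"{zone}/{source}/{dataset}/{day:%Y/%m/%d}"

# An ERP extract landing in the raw zone, and its standardized counterpart.
raw = lake_path("raw", "erp", "sales-orders", date(2021, 3, 15))
curated = lake_path("curated", "erp", "sales-orders", date(2021, 3, 15))
print(raw)      # raw/erp/sales-orders/2021/03/15
print(curated)  # curated/erp/sales-orders/2021/03/15
```

Because the folder structure is hierarchical, POSIX-style ACLs can then be applied per zone or per source system, e.g. granting plant teams write access to their own raw folders only.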
Batch data ingestion can be handled using Data Factory. This data integration service is easy to use, has multiple built-in service connections, and can reach both other cloud services and on-premise ones. All these features make it a good candidate for batch ingestion and orchestration. PolyR could use this service to connect to on-premise servers and upload the raw data (ERP extracts, reference data from plants, etc.) to the Data Lake. Moreover, since PolyR’s CRM provider supports API connections to its backend, Data Factory can extract CRM data as well. Lastly, Data Factory could also be used to export data from the Cloud back to on-premise systems, to continue supporting on-premise applications.
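For illustration, a minimal sketch of what a Data Factory copy pipeline definition could look like in JSON. The activity and dataset names are hypothetical, and reaching an on-premise SQL Server additionally requires a self-hosted integration runtime:

```json
{
  "name": "CopyErpExtractToLake",
  "properties": {
    "activities": [
      {
        "name": "CopyFromOnPremSql",
        "type": "Copy",
        "typeProperties": {
          "source": { "type": "SqlServerSource" },
          "sink": { "type": "ParquetSink" }
        },
        "inputs": [
          { "referenceName": "OnPremErpTable", "type": "DatasetReference" }
        ],
        "outputs": [
          { "referenceName": "RawZoneParquet", "type": "DatasetReference" }
        ]
      }
    ]
  }
}
```

A pipeline like this would typically be triggered on a schedule, landing each extract in the raw zone of the Data Lake.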
Real-time data ingestion can be handled using Event Hub. This data ingestion service is simple to use, scalable and integrates seamlessly with other services, which makes it a good candidate for streaming data. Although there are other data streaming services such as Azure IoT Hub, PolyR needs a solution that can stream events less than 1MB in size, and bi-directional communication is not necessary. Therefore, Event Hub is preferred over IoT Hub for ingesting raw data from sensors on the production lines. Lastly, Event Hub’s Capture feature makes persisting data into the Data Lake easy, which is perfect for long-term retention.
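As a sketch of the producer side, the snippet below builds a sensor event payload and validates it against the 1MB limit mentioned above. The field names and values are assumptions, not PolyR’s actual schema:

```python
import json

# Illustrative sensor reading as a plant line might emit it (fields assumed).
event = {
    "plantId": "PL-ASIA-03",
    "lineId": "extruder-7",
    "timestamp": "2021-03-15T08:42:11Z",
    "temperatureC": 214.6,
    "pressureBar": 87.2,
}

body = json.dumps(event).encode("utf-8")

# Event Hub enforces a maximum event size (1 MB on the standard tier),
# so a producer should validate payloads before sending them.
MAX_EVENT_BYTES = 1_000_000
assert len(body) < MAX_EVENT_BYTES
print(f"{len(body)} bytes, within the Event Hub size limit")
```

In a real producer, `body` would then be handed to the Event Hubs SDK for sending; the validation logic stays the same.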
Once raw data is landed on the Cloud, it can be standardized using Databricks. This data processing service enables large-scale processing of both batch and real-time data, supports workflows in Scala, Python, SQL and R, and is easily scalable. Therefore, it is a perfect candidate for processing raw data that has been landed in the Data Lake in order to standardize it (apply standard structure, convert data types, convert file format, etc.). The result can then be sent back to the Data Lake to a curated layer.
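In Databricks, standardization would run as a Spark job over entire Data Lake folders; the per-record rules themselves can be sketched in plain Python. The raw field names, formats and target schema below are assumptions for illustration:

```python
from datetime import datetime

# A raw record as landed from a plant database (keys and formats assumed).
raw_record = {"ORDER_ID": "A-1029", "QTY": "350", "PROD_DATE": "15/03/2021"}

def standardize(record: dict) -> dict:
    """Apply standard structure: snake_case names, typed values, ISO dates."""
    return {
        "order_id": record["ORDER_ID"],
        "quantity": int(record["QTY"]),
        "production_date": datetime.strptime(
            record["PROD_DATE"], "%d/%m/%Y"
        ).date().isoformat(),
    }

print(standardize(raw_record))
# {'order_id': 'A-1029', 'quantity': 350, 'production_date': '2021-03-15'}
```

The standardized output would then be written back to the curated layer of the Data Lake, typically in a columnar format such as Parquet.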
Cloud architecture example 2: Self-service reporting
In this second use case, PolyR would like to leverage its Cloud Data Hub for reporting purposes:
“Having launched the Cloud Data Hub initiative, PolyR’s management would like now to take the opportunity to redefine reporting standards within the company. At the moment, each department is responsible for its own reporting. Therefore, the corporate landscape is fractured, and multiple reporting solutions coexist. Some departments use specialized tools to create and serve reports, but Excel & PowerPoint are popular because of the independence they offer. PolyR would like to set up a common self-service reporting solution that would still allow some degree of freedom to each department.”
The following diagram summarizes a high-level architecture that could answer PolyR’s needs.
In this cloud architecture example, the design supports the following features:
Data needed for reporting can be modeled in a Synapse Analytics Workspace. This analytics service is a combination of SQL engines, Spark engines and data pipelines that provides a workspace data engineers can use to model and serve data to other applications. Since PolyR estimates the size of the data to be above 1TB, and it is necessary to model curated data from the Data Lake, this service is a good candidate to get the job done. Data engineers will be able to use data pipelines to copy data from the central Data Lake into landing tables in SQL, and then use the SQL and Spark engines to model the data to enterprise-grade requirements.
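As a sketch of such modeling, here is the kind of reporting view a data engineer might define on top of the landing tables. The schema, table and column names are purely illustrative, not from PolyR’s actual model:

```sql
-- Hypothetical Synapse view exposing curated production data for reporting.
CREATE VIEW reporting.v_production_by_plant AS
SELECT
    p.plant_name,
    o.production_date,
    SUM(o.quantity) AS total_quantity
FROM landing.sales_orders AS o
JOIN landing.plants AS p
    ON p.plant_id = o.plant_id
GROUP BY p.plant_name, o.production_date;
```

Views like this become the stable interface that Power BI users query, so the underlying landing tables can be reloaded or restructured without breaking reports.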
Modeled data can be served to consumers using the Power BI Service. This cloud service is a Software as a Service (SaaS) offering that includes a collection of software, apps and a web portal that can be used to display and manage reports. SQL views built in Synapse by the data engineers at PolyR will be accessible from Power BI, allowing users to build custom data sets and reports they can share with the rest of the enterprise. Although Power BI has an offering that allows customers to host the Power BI service on on-premise servers, PolyR decided to go with the Cloud version so as not to have to manage the back-end infrastructure.
Cloud architecture example 3: Real-Time analysis and predictions
In this last use case, PolyR would like to leverage its existing cloud infrastructure to implement a data science project: “Having been convinced by the cloud initiatives taken so far, the manager of one of PolyR’s plants would like to take advantage of cloud computing in order to leverage data available from the sensors installed on the production lines. These sensors collect data during the manufacturing of plastics and send it to a database on the factory’s premises. Currently, the data is used by laboratory clerks to ensure production quality, but the current system is reaching the end of its lifecycle and will have to be replaced. In addition, the manager would like to use this opportunity to make it easier for laboratory clerks to analyze data and detect production defects, by means of a predictive data model.”
The following diagram summarizes a high-level architecture that could help the plant manager achieve this goal.
In this example, the cloud architecture design supports the following features:
Existing Data Lake storage can be leveraged to serve historical events, persisted as files, as training data sets for developing machine learning models.
Existing Event Hubs can be leveraged as a real-time input stream to which the machine learning model can listen, in order to analyze the data produced by sensors in the plant.
Predictive models can be created using Machine Learning Workspaces. This service is a cloud-based environment that allows data scientists to train, deploy and track machine learning models. Moreover, this service comes with operational features that facilitate the lifecycle management of trained models. PolyR’s data scientists will thus be able to use this service to create the models needed to analyze the production events, store the resulting model images, and then deploy them to Docker containers for execution.
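To make the idea concrete, here is a deliberately minimal stand-in for such a predictive model: a statistical rule that flags sensor readings far outside the historical distribution as potential defects. The sample values are made up, and a real project would train a proper model on the Data Lake history instead:

```python
from statistics import mean, stdev

# Historical sensor readings from the Data Lake would form the training set;
# this small sample is purely illustrative.
training = [214.1, 213.8, 214.5, 215.0, 214.2, 213.9, 214.7, 214.3]

# Minimal stand-in for a trained model: flag readings more than
# 3 standard deviations from the training mean as potential defects.
mu, sigma = mean(training), stdev(training)

def is_defect(reading: float) -> bool:
    return abs(reading - mu) > 3 * sigma

print(is_defect(214.4))  # False: within the normal operating band
print(is_defect(221.0))  # True: far outside the training distribution
```

Whatever the actual model, the Workspace’s role is the same: train it on the historical files, version it, and package it as an image that can be deployed to containers.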
Containerized models will run on Kubernetes Services. This service is the Azure implementation of the Kubernetes system, a tool for automating the deployment, scaling, and management of containerized applications. In this case, Kubernetes will be used to host the machine learning model, which will listen to Event Hub for data to be processed.
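For illustration, a sketch of the Kubernetes manifest that could host the scoring container. The image name, replica count and secret names are assumptions:

```yaml
# Illustrative AKS deployment for the containerized scoring model.
apiVersion: apps/v1
kind: Deployment
metadata:
  name: defect-scoring-model
spec:
  replicas: 2
  selector:
    matchLabels:
      app: defect-scoring-model
  template:
    metadata:
      labels:
        app: defect-scoring-model
    spec:
      containers:
        - name: scorer
          image: polyr.azurecr.io/defect-scoring:1.0
          env:
            - name: EVENTHUB_CONNECTION_STRING
              valueFrom:
                secretKeyRef:
                  name: eventhub-secret
                  key: connection-string
```

Running two replicas lets Kubernetes keep the scorer available during restarts, and the Event Hub connection string is injected from a secret rather than baked into the image.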
The results of the machine learning model will be sent to Cosmos DB. This service is a non-relational database characterized by low latency, high availability and automatic scaling, optimized for app development. These characteristics will ensure the results produced by the predictive model can be stored and made immediately available for use.
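As a sketch of what the scorer would write, the snippet below builds a result document in the shape Cosmos DB expects. Cosmos DB requires a string `id` per item; every other field here is an assumption for illustration:

```python
import json
import uuid

# Shape of one scored result as it might be written to Cosmos DB.
result_item = {
    "id": str(uuid.uuid4()),       # Cosmos DB requires a unique string id
    "plantId": "PL-ASIA-03",
    "lineId": "extruder-7",
    "timestamp": "2021-03-15T08:42:11Z",
    "isDefect": True,
    "score": 0.97,
}

# Items must be JSON-serializable before being handed to the SDK.
payload = json.dumps(result_item)
assert isinstance(result_item["id"], str)
print("item ready:", len(payload), "bytes")
```

In the real pipeline the container on Kubernetes would upsert such items via the Cosmos DB SDK, and the laboratory clerks’ dashboards would read them back with millisecond-level latency.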
Lastly, the existing Synapse Analytics & Power BI services can be leveraged to serve reporting data. This will ensure that the critical operational data store is separated from the reporting data store, avoiding interference, while at the same time offering flexibility in terms of data exploration and dashboarding.
Conclusion
Hopefully this article served its purpose of introducing cloud architecture through concrete examples and sparking interest in this wonderful field. As a closing note, it is important to highlight that the Cloud is vast, and all kinds of combinations are possible in order to bring ideas to fruition! Although this article emphasized the use of Azure as the cloud provider, it doesn’t mean that projects must be locked in to a single provider. Nor does it mean that cloud projects cannot be hybrid solutions mixing cloud and on-premise resources. In cloud projects, what matters is not the cloud architecture itself, but rather its capability of achieving the desired level of control over the final solution, its fit with business requirements, and the value it ultimately brings to the organization.