The Changing Landscape of Infrastructure and Operations

Posted February 9, 2024 by Sayers 

Today’s IT infrastructure and operations (I&O) teams are preparing for the convergence of multiple technology trends that will affect their ability to deliver scalable, agile, and resilient platforms.

Among those trends discussed at the recent Gartner IT Infrastructure, Operations, and Cloud Strategies conference, three stand out:

  • Generative artificial intelligence, requiring decisions on whether to buy or build your organization’s GenAI solutions
  • Infrastructure platform engineering, bringing an increased focus on ways to deliver applications more quickly and reliably
  • Edge computing, moving compute closer to data sources for IoT and AI applications

Read on for more key takeaways from the conference, which drew some 5,000 attendees and nearly 130 vendors.

Organizations Adopting GenAI Face Several Decisions

Generative AI uses machine learning and foundation models to generate new content, product designs, business processes, and other outputs that resemble, but don’t repeat, the original data.

Listen to enough analysts and you’ll come away thinking generative AI will be the biggest industry disruptor since the internet. While a “distrust and verify” approach to GenAI’s outputs is warranted, generative AI’s role as a strategic innovation for I&O means more companies will want to pursue GenAI initiatives.

According to Gartner’s 2024 CIO and Technology Executive Survey, which a Gartner I&O conference keynote highlighted:

80% of CIOs and tech leaders plan full GenAI adoption within three years.

Before adopting generative AI, companies must decide whether they want to buy or build GenAI solutions in their environment. Those decisions include:

  • Leveraging OpenAI and other publicly available large language models (LLMs) vs. creating and building private LLMs
  • Consuming compute and storage in a public cloud vs. provisioning infrastructure on premises

Gartner recommends starting small, then iterating by testing AI functionality in a proof of concept before moving into production.

Mark McCully, Director of Infrastructure & Operations Modernization Engineering at Sayers, says:

“Companies may want to kick the tires on GenAI, build a chatbot, and expand from there, based on what they think will drive the most business value. It probably will be less expensive initially in the public cloud, rather than purchasing a lot of compute and storage, GPUs, and related requirements.”

Other considerations in your GenAI approach include power consumption and cooling. Next-generation data centers have to handle the power and scale needed to run GenAI, which requires heavy-duty GPUs and DPUs (data processing units). As companies spin up more AI-related workloads, those computing processes generate more heat and require liquid cooling solutions.

Many companies will likely start their GenAI journey in the public cloud using technologies such as Microsoft Copilot, OpenAI, or Hugging Face for access to large language models. 
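For teams that begin in the public cloud, the first proof of concept can be as small as a single call to a hosted model. Below is a minimal sketch in Python, assuming the OpenAI Python SDK (version 1.x) and an API key in the OPENAI_API_KEY environment variable; the model name and prompts are illustrative, not a recommendation.

    from openai import OpenAI

    # The client picks up OPENAI_API_KEY from the environment by default
    client = OpenAI()

    # Illustrative model name and prompts; substitute what your provider offers
    response = client.chat.completions.create(
        model="gpt-4o-mini",
        messages=[
            {"role": "system", "content": "You answer questions about our internal IT runbooks."},
            {"role": "user", "content": "How do I request a new test virtual machine?"},
        ],
    )

    print(response.choices[0].message.content)

A pilot like this stays on a consumption model, consistent with the point above about starting in the public cloud before purchasing GPUs, storage, and cooling.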

Expected to play a large role in GenAI adoption is Azure Stack, a portfolio of products that extend Microsoft Azure services and capabilities from the data center to edge locations and remote offices. McCully says:

“Azure is a good vehicle to try GenAI initiatives through a consumption model, without having to outlay a lot of capital or opex expenses to bring things you will need on premise such as GPUs and liquid cooling.”

Infrastructure Platform Engineering Upskills Your Delivery Model

Platform engineering incorporates key tenets such as automation, observability, and self-service into your infrastructure environment so your software development team can deliver stable and secure applications faster.

Key elements of infrastructure platform engineering include: 

  • Container platforms, such as OpenShift, and Kubernetes data services, such as Portworx
  • Resilience with data protection and isolated recovery environments to keep your platform up and running consistently
  • Multi-cloud / portability, with the ability to create a workload in the public cloud or on premises
  • Consumption-based / as-a-service, with offerings such as HPE GreenLake, Dell APEX, and Pure Storage Evergreen//One. Such services allow you to outsource your compute and storage, freeing up your engineers for other areas such as DevOps and programmatic orchestration
  • Observability with solutions such as Dynatrace, New Relic, and LogicMonitor to know what’s going on with your applications and infrastructure performance, and get to the root cause quickly if there’s an issue
  • Repeatable processes
  • Automation, using solutions such as Ansible for IT infrastructure automation (see the sketch after this list). Karen Brown, Director Analyst at Gartner, says:

“By 2026, 30% of enterprises will automate more than half of their network activities, an increase from less than 10% of enterprises in early 2023.”
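To make the automation and self-service elements above concrete, here is a minimal sketch, assuming a Kubernetes or OpenShift cluster and the official kubernetes Python client; the deployment name and namespace are hypothetical. It reads a deployment’s current state and scales it, the kind of repeatable operation a platform team would wrap in Ansible or expose through a self-service portal.

    from kubernetes import client, config

    # Load credentials from the local kubeconfig (in-cluster config also works)
    config.load_kube_config()
    apps = client.AppsV1Api()

    # Hypothetical deployment name and namespace, for illustration only
    DEPLOYMENT = "web-frontend"
    NAMESPACE = "demo"

    # Report the current replica count, then scale to three replicas
    deployment = apps.read_namespaced_deployment(DEPLOYMENT, NAMESPACE)
    print(f"{DEPLOYMENT} currently runs {deployment.spec.replicas} replicas")

    apps.patch_namespaced_deployment_scale(
        DEPLOYMENT,
        NAMESPACE,
        {"spec": {"replicas": 3}},
    )
    print(f"Scaled {DEPLOYMENT} to 3 replicas")

Wrapped in a pipeline or a developer portal, those few lines become a repeatable, observable operation rather than a one-off manual change.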

GenAI and IoT Drive Computing to the Edge 

The growth of AI and the Internet of Things will fuel not only unstructured data growth but also edge computing, where processing occurs closer to data sources for more efficient IoT and AI applications.

According to Gartner (Modernize Data and Analytics Capabilities, gartner.com):

By 2025, more than 50% of enterprise-managed data will be created and processed outside the data center or cloud.

Several industries will push the boundaries of edge computing. That’s because much of their data isn’t processed or held in the main data center, but rather at edge locations such as branch offices. 

Those industries and use case examples include manufacturing (system automation), retail (immersive e-commerce), healthcare (heart monitors), finance (high-frequency trading), and energy (real-time grid adjustments based on consumption). 

Public clouds such as Microsoft Azure provide edge capabilities with IoT enablement options, while Azure Stack offers on-prem cloud-like services. 

With a solution like Azure Stack, you can bring specific data sets or applications back down from the cloud to on-premises locations. Keeping those workloads local avoids the recurring egress charges associated with pulling data out of the cloud, and it also moves your data closer to the end users who consume it from your applications. McCully says:

“While you might be trying to put as much as you can in the cloud, there are going to be certain workloads or edge use cases where it makes sense to use something like Azure Stack to keep the data or the application more local and avoid some egress fees or latency issues.”
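As a simple illustration of that trade-off, the sketch below, written in plain Python with hypothetical function names, aggregates sensor readings at the edge site and ships only a compact summary upstream, the pattern that keeps egress fees and latency down.

    import random
    import statistics

    def read_sensor() -> float:
        # Hypothetical stand-in for a local IoT device read (e.g., a temperature probe)
        return 20.0 + random.gauss(0, 0.5)

    def summarize(window: list[float]) -> dict:
        # Aggregate locally so only a small summary ever leaves the edge site
        return {
            "count": len(window),
            "mean": round(statistics.mean(window), 2),
            "max": round(max(window), 2),
        }

    def send_to_cloud(summary: dict) -> None:
        # Hypothetical uplink; in practice this might be an IoT hub or a REST endpoint
        print("Uploading summary:", summary)

    if __name__ == "__main__":
        readings = [read_sensor() for _ in range(600)]  # e.g., ten minutes of 1 Hz samples
        send_to_cloud(summarize(readings))

The raw samples stay local for the workloads that need them, while the cloud still receives the signal it needs for reporting and analytics.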

Embed Security Into Infrastructure and Operations

The growth of data from AI and IoT brings more concerns about data security as well as application security. Your infrastructure platform engineering initiatives should include conversations about how you embed security into your application development and deployment. 

For securing the edge, options include software-defined wide area networking (SD-WAN), secure access service edge (SASE), and cyber-physical systems security solutions.

Questions? Contact us at Sayers today for help in choosing the right technology solutions for your business.
