
DesignWare IP for Cloud Computing SoCs
Overview
Hyperscale cloud data centers continue to evolve due to tremendous Internet traffic growth from online collaboration, smartphones and
other IoT devices, video streaming, augmented and virtual reality (AR/VR) applications, and connected AI devices. This is driving the need
for new architectures for compute, storage, and networking such as AI accelerators, Software Defined Networks (SDNs), communications
network processors, and solid state drives (SSDs) to improve cloud data center efficiency and performance. Re-architecting the cloud data
center for these latest applications is driving the next generation of semiconductor SoCs to support new high-speed protocols to optimize
data processing, networking, and storage in the cloud. Designers building systems-on-chip (SoCs) for cloud and high performance computing
(HPC) applications need a combination of high-performance and low-latency IP solutions to help deliver total system throughput. Synopsys
provides a comprehensive portfolio of high-quality, silicon-proven IP that enables designers to develop SoCs for high-end cloud computing,
including AI accelerators, edge computing, visual computing, compute/application servers, networking, and storage applications. Synopsys’
DesignWare® Foundation IP, Interface IP, Security IP, and Processor IP are optimized for high performance, low latency, and low power, while
supporting advanced process technologies from 16-nm to 5-nm FinFET and future process nodes.

High-Performance Computing
Today’s high-performance computing (HPC) solutions provide detailed insights into the world around us and improve our quality of life. HPC
solutions deliver the data processing power for massive workloads required for genome sequencing, weather modeling, video rendering,
engineering modeling and simulation, medical research, big data analytics, and many other applications. Whether deployed in the cloud or
on-premises, these solutions require high-performance, low-latency compute, networking, and storage resources, as well as leading-edge
artificial intelligence capabilities. Synopsys provides a comprehensive portfolio of high-quality, silicon-proven IP that enables designers to
develop HPC SoCs for AI accelerators, networking, and storage systems.

Benefits of Synopsys DesignWare IP for HPC
• Industry's widest selection of high-performance interface IP, including DDR, PCI Express, CXL, CCIX, Ethernet, and HBM, offers high bandwidth and low latency to meet HPC requirements
• Highly integrated, standards-based security IP solutions enable the most efficient silicon design and highest levels of data protection
• Low-latency embedded memories with standard and ultra-low leakage libraries, optimized for a range of cloud processors, provide a power- and performance-efficient foundation for SoCs

IP for HPC SoCs in Cloud Computing (block diagram)

Artificial Intelligence (AI) Accelerators
AI accelerators process tremendous amounts of data for deep learning workloads, including training and inference, which require large memory capacity, high bandwidth, and cache coherency within the overall system. AI accelerator SoC designs have myriad requirements, including high performance, low power, cache coherency, integrated high-bandwidth interfaces that are scalable to many cores, heterogeneous processing hardware accelerators, Reliability-Availability-Serviceability (RAS), and massively parallel deep learning neural network processing. Synopsys offers a portfolio of DesignWare IP in advanced FinFET processes that addresses the specialized processing, acceleration, and memory performance requirements of AI accelerators.
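For a sense of the memory bandwidth these requirements imply, the short sketch below estimates per-stack HBM2E throughput from a per-pin data rate and the 1024-bit stack interface. The 3.2 Gbps per-pin figure and the four-stack configuration are illustrative assumptions, not Synopsys specifications.

```python
# Rough estimate of peak HBM2E bandwidth per stack. The per-pin data rate is
# an assumed speed grade for illustration, not a Synopsys specification.
PIN_RATE_GBPS = 3.2            # assumed per-pin data rate (Gbps)
INTERFACE_WIDTH_BITS = 1024    # standard HBM stack interface width

def hbm_stack_bandwidth_gb_s(pin_rate_gbps: float,
                             width_bits: int = INTERFACE_WIDTH_BITS) -> float:
    """Peak theoretical bandwidth of one HBM stack in GB/s."""
    return pin_rate_gbps * width_bits / 8

if __name__ == "__main__":
    per_stack = hbm_stack_bandwidth_gb_s(PIN_RATE_GBPS)
    print(f"One HBM2E stack at {PIN_RATE_GBPS} Gbps/pin: ~{per_stack:.0f} GB/s")
    print(f"Four stacks (assumed): ~{4 * per_stack:.0f} GB/s aggregate")
```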

Benefits of Synopsys DesignWare IP for AI
• Industry's widest selection of high-performance interface IP, including DDR, USB, PCI Express (PCIe), CXL, CCIX, Ethernet, and HBM, offers high bandwidth and low latency to meet the high-performance requirements of AI servers
• Highly integrated, standards-based security IP solutions enable the most efficient silicon design and highest levels of data protection
• Low-latency embedded memories with standard and ultra-low leakage libraries, optimized for a range of cloud processors, provide a power- and performance-efficient foundation for SoCs

IP for Core AI Accelerator (block diagram)

Edge Computing
The convergence of cloud and edge is bringing cloud services closer to the end-user for richer, higher performance, and lower latency
experiences. At the same time, it is creating new business opportunities for cloud service providers and telecom providers alike as they
deliver localized, highly responsive services that enable new online applications.

These applications include information security, traffic and materials flow management, autonomous vehicle control, augmented and virtual
reality, and many others that depend on rapid response. For control systems in particular, data must be delivered reliably, with little delay between data collection and the issuing of commands based on that data.

To minimize application latency, service providers are moving the data collection, storage, and processing infrastructure closer to the point of
use—that is, to the network edge. To create the edge computing infrastructure, cloud service providers are partnering with telecommunications
companies to deliver cloud services on power- and performance-optimized infrastructure at the network edge.

Benefits of Synopsys DesignWare IP for Edge Computing
• Industry's widest selection of high-performance interface IP, including DDR, USB, PCI Express, CXL, CCIX, Ethernet, and HBM, offers high bandwidth and low latency to meet the high-performance requirements of edge computing servers
• Highly integrated, standards-based security IP solutions enable the most efficient silicon design and highest levels of data protection
• Low-latency embedded memories with standard and ultra-low leakage libraries, optimized for a range of edge systems, provide a power- and performance-efficient foundation for SoCs

IP for Edge Server SoC (block diagram)

Visual Computing
As cloud applications evolve to include more visual content, support for visual computing has emerged as an additional function of cloud
infrastructure. Applications for visual computing include streaming video for business applications, online collaboration, on-demand movies,
online gaming, and image analysis for ADAS, security, and other systems that require real-time image recognition. The proliferation of
visual computing as a cloud service has led to the integration of high-performance GPUs into cloud servers, connected to the host CPU
infrastructure via high-speed accelerator interfaces.

Benefits of Synopsys DesignWare IP for Visual Computing
• Silicon-proven PCIe 5.0 IP is used by 90% of leading semiconductor companies
• CXL IP is built on silicon-proven DesignWare PCI Express 5.0 IP for reduced integration risk and provides cache coherency to minimize data copying within the system
• HBM2/2E IP is optimized for power efficiency, using 80% less power than competitive solutions

Server-based graphics accelerator block diagram

Servers
The growth of cloud data is driving an increase in compute density within both centrally located hyperscale data centers and remote
facilities at the network edge. The increase in compute density is leading to demand for more energy-efficient CPUs to enable increased
compute capability within the power and thermal budget of existing data center facilities. The demand for more energy-efficient CPUs has
led to a new generation of server CPUs optimized for performance/watt.

This same increase in data volume is also driving demand for faster server interfaces to move data within and between servers. Movement of
data within the server can be a major bottleneck and source of latency. Minimizing data movement as much as possible and providing high-
bandwidth, low-latency interfaces for moving data when required are key to maximizing performance and minimizing both latency and power
consumption for cloud and HPC applications. To improve performance, all internal server interfaces are getting upgrades:

• DDR5 interfaces are moving to 6400 Mbps
• PCIe interfaces are doubling in bandwidth as they move from PCIe 4.0 at 16GT/s to PCIe 5.0 at 32GT/s and PCIe 6.0 at 64GT/s (a rough per-link bandwidth calculation follows this list)
• Compute Express Link (CXL) provides a cache coherent interface that runs over the PCIe electrical interface and reduces the amount
of data movement required in a system by allowing multiple processors/accelerators to share data and memory efficiently
• New high-speed SerDes technology at 56Gbps and 112Gbps using PAM4 encoding and supporting protocols enable faster interfaces
between devices including die, chips, accelerators, and backplanes
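To make the PCIe doubling concrete, the sketch below computes raw per-direction bandwidth for an x16 link across generations. It uses raw transfer rates only and ignores encoding and protocol overhead (128b/130b for PCIe 4.0/5.0, FLIT-mode framing for PCIe 6.0), so achievable throughput is somewhat lower.

```python
# Raw per-direction bandwidth of a PCIe x16 link across generations.
# Ignores line coding and protocol overhead (128b/130b for PCIe 4.0/5.0,
# FLIT-mode framing for PCIe 6.0), so achievable throughput is lower.
GENERATION_RATES_GT_S = {"PCIe 4.0": 16, "PCIe 5.0": 32, "PCIe 6.0": 64}
LANES = 16

def raw_link_bandwidth_gb_s(rate_gt_s: float, lanes: int = LANES) -> float:
    """One transfer carries one raw bit per lane, so GB/s = GT/s * lanes / 8."""
    return rate_gt_s * lanes / 8

if __name__ == "__main__":
    for generation, rate in GENERATION_RATES_GT_S.items():
        bw = raw_link_bandwidth_gb_s(rate)
        print(f"{generation}: {rate} GT/s x{LANES} -> ~{bw:.0f} GB/s per direction")
```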

Benefits of Synopsys DesignWare IP for Cloud Compute Servers
• Silicon-proven PCIe 5.0 IP is used by 90% of leading semiconductor companies
• CXL IP is built on silicon-proven DesignWare PCI Express 5.0 IP for reduced integration risk and supports storage class memory (also referred to as persistent memory) for speed approaching that of DRAM with SSD-like capacity and cost
• 112Gbps XSR/USR SerDes supports a wide range of data rates (2.5 to 112 Gbps) with area-optimized RX

Cloud server block diagram


Networking
Traditional data centers use a tiered network topology consisting of switched Ethernet with VLAN tagging. This topology defines only one path through the network, which has traditionally handled north-south data traffic. The transition to a flat, two-tier leaf-spine hyperscale data center network using
up to 800G Ethernet links enables virtualized servers to distribute workflows among many virtual machines, creating a faster, more scalable cloud
data center environment.
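As a rough illustration of how leaf-spine fabrics are sized, the sketch below computes a leaf switch's oversubscription ratio from hypothetical port counts; the 48 x 100G downlinks and 8 x 800G uplinks are assumptions for illustration, not a reference design.

```python
# Oversubscription ratio of one leaf switch in a two-tier leaf-spine fabric:
# total server-facing (downlink) capacity divided by total uplink capacity.
# Port counts and speeds are illustrative assumptions, not a reference design.
def oversubscription_ratio(down_ports: int, down_gbps: int,
                           up_ports: int, up_gbps: int) -> float:
    return (down_ports * down_gbps) / (up_ports * up_gbps)

if __name__ == "__main__":
    ratio = oversubscription_ratio(down_ports=48, down_gbps=100,
                                   up_ports=8, up_gbps=800)
    # A ratio <= 1.0 means the leaf can forward all server traffic toward the
    # spine layer without contention (a non-blocking fabric).
    print(f"48 x 100G down vs 8 x 800G up -> {ratio:.2f}:1 oversubscription")
```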

Smart network interface cards (NICs) combine hardware, programmable AI acceleration, and security resources to offload server
processors, freeing the processors to run applications. Integrated security, including a root of trust, protects coefficient and biometric
data as it moves to and from local memories. Smart NICs accelerate the embedded virtual switch, transport offloads, and protocol overlay encapsulation/decapsulation such as NVGRE, VXLAN, and MPLS. By offering dedicated hardware offloads, including NVMe over Fabrics (NVMe-oF), Smart NICs free the server CPU to focus compute cycles on cloud application software and enable efficient data
sharing across nodes for HPC workloads.
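To illustrate the kind of overlay encapsulation a Smart NIC offloads, the minimal sketch below builds the 8-byte VXLAN header defined in RFC 7348 and prepends it to an inner Ethernet frame. In a Smart NIC this work, along with the outer UDP/IP headers omitted here, is handled in hardware rather than host software.

```python
import struct

def vxlan_header(vni: int) -> bytes:
    """Build the 8-byte VXLAN header from RFC 7348.

    Layout: 8-bit flags (0x08 = valid-VNI bit set), 24 reserved bits,
    24-bit VNI, 8 reserved bits.
    """
    if not 0 <= vni < 1 << 24:
        raise ValueError("VNI must fit in 24 bits")
    flags_word = 0x08 << 24    # flags in the top byte, reserved bits zero
    vni_word = vni << 8        # VNI in the upper 24 bits of the second word
    return struct.pack("!II", flags_word, vni_word)

def encapsulate(inner_frame: bytes, vni: int) -> bytes:
    """Prepend the VXLAN header; outer Ethernet/IP/UDP headers are omitted."""
    return vxlan_header(vni) + inner_frame

if __name__ == "__main__":
    packet = encapsulate(bytes(64), vni=5000)   # 64-byte placeholder frame
    print(packet[:8].hex())                     # -> 0800000000138800
```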

Network switch SoCs enable cloud data center top-of-rack and end-of-row switches and routers to scale port densities and speeds to
quickly adapt to changing cloud application workloads. As port speeds scale from 10G Ethernet to 400/800G Ethernet and port densities grow from dozens to hundreds of ports, the latest generation of Ethernet switch SoCs must provide the lowest-latency, highest-throughput flow control and traffic management. Synopsys’ DesignWare Interface IP portfolio supports high-performance
protocols such as Ethernet, PCI Express, CXL, CCIX, USB, DDR, and HBM. DesignWare Interface IP is optimized to help designers meet the
high-throughput, low-latency connectivity needs of cloud computing networking applications. Synopsys’ Foundation IP offers configurable
embedded memories for performance, power, and area, as well as high-speed logic libraries for all processor cores.

Communication service providers are turning towards server virtualization to increase efficiency, flexibility, and agility to optimize network packet processing. The latest communications architecture uses Open vSwitch (OVS) offloads, OVS over the Data Plane Development Kit (DPDK), network overlay virtualization, SR-IOV, and RDMA to enable the software-defined data center and Network Function Virtualization (NFV), accelerating communications infrastructure. To achieve higher performance, communications network processors can accelerate OVS offloads for efficiency and security. Synopsys provides a portfolio of high-speed interface IP including DDR, HBM, Ethernet for up to 800G links, CXL for cache coherency, and PCI Express for up to 64GT/s data rates. DesignWare Security IP enables the highest levels of security encryption, and embedded ARC processors offer fast, energy-efficient solutions to meet throughput and QoS requirements. Synopsys’ Foundation IP delivers low-latency embedded memories with standard and ultra-low leakage libraries for a range of cloud processors.

IP for Smart NIC in cloud computing network (block diagram)

IP for cloud computing network switch (block diagram)

IP for communication network processors (block diagram)

Benefits of Synopsys DesignWare IP for Cloud Computing Networking
• Synopsys’ portfolio of IP in advanced foundry processes, supporting high-speed protocols such as DDR, HBM, Ethernet, USB, CCIX, CXL, and PCI Express, is optimized to meet the high-throughput, low-latency connectivity needs of hyperscale data center networking and cloud communications network processor applications
• Low-latency embedded memories with standard and ultra-low leakage libraries, optimized for a range of cloud processors, provide a power- and performance-efficient foundation for SoCs
• Configurable AMBA interconnects with a library of peripheral components deliver SoC design flexibility and minimize
design complexity
• Highly integrated, standards-based security IP solutions enable the most efficient silicon design and highest levels of
security encryption
• ARC processors, supported by a broad spectrum of third-party tools, operating systems, and middleware from leading industry vendors, offer high-speed, energy-efficient IP to meet throughput and QoS requirements

Storage
NVMe-based solid-state drives (SSDs) can use a PCIe interface to connect directly to the server CPU and function as a cache accelerator, allowing frequently accessed, or “hot,” data to be cached extremely fast. High-performance PCIe-based NVMe SSDs with highly efficient input/output operation and low read latency improve server efficiency and avoid having to access the data through an external storage device. NVMe SSD server acceleration is ideal for high-transaction applications such as AI acceleration or database queries, as well as HPC workloads that require high-performance, low-latency access to large data sets. PCIe-based NVMe SSDs not only reduce power and cost but also minimize area compared to hard disk drives (HDDs). Synopsys’ portfolio of DesignWare Interface IP for advanced foundry processes, supporting high-speed protocols such as PCI Express, USB, and DDR, is optimized to help designers meet their high-throughput, low-power, and low-latency connectivity requirements for cloud computing storage applications. Synopsys’ Foundation IP offers configurable embedded memories for performance, power, and area, as well as high-speed logic libraries for all processor cores. Synopsys also provides processor IP ideally suited for flash SSDs.
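As a minimal illustration of the low-latency access pattern NVMe SSDs enable, the sketch below issues an O_DIRECT read against a hypothetical NVMe block device on Linux, bypassing the page cache. The device path and block size are assumptions, and production storage stacks would typically use an asynchronous interface such as io_uring or a userspace driver.

```python
import mmap
import os

DEVICE = "/dev/nvme0n1"   # hypothetical NVMe namespace; requires root to open
BLOCK = 4096              # O_DIRECT needs block-size-aligned buffers and offsets

def read_block_direct(device: str, offset: int, length: int = BLOCK) -> bytes:
    """Read one aligned block from an NVMe device, bypassing the page cache."""
    # An anonymous mmap region is page-aligned, satisfying the typical
    # 4 KiB alignment requirement for O_DIRECT buffers.
    buf = mmap.mmap(-1, length)
    fd = os.open(device, os.O_RDONLY | os.O_DIRECT)
    try:
        os.preadv(fd, [buf], offset)
        return bytes(buf)
    finally:
        os.close(fd)
        buf.close()

if __name__ == "__main__":
    data = read_block_direct(DEVICE, offset=0)
    print(f"Read {len(data)} bytes from {DEVICE}")
```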

Benefits of Synopsys DesignWare IP for Cloud Storage
• High-performance, low-latency PCI Express controllers and PHYs supporting data rates up to 64GT/s enable NVMe-based SSDs
• High-performance, low-power ARC processors support fast read/write speeds for NVMe-based SSDs
• Portfolio of interface IP including Ethernet, USB, PCI Express, and DDR provides low latency and fast read/write operations

Figure 6: IP for cloud computing storage (block diagram)

Interface IP
Ethernet Controller and PHY: NRZ and PAM-4 112G and 56G Ethernet PHYs and configurable controllers for up to 800G hyperscale data center SoCs
DDR5/4 Controller and PHY: DDR memory interface controllers and PHYs supporting system performance up to 6400 Mbps share main memory with compute offload engines plus network and storage I/O resources
HBM2/2E PHY: HBM2/2E IP allows high memory throughput with minimal power consumption
USB Controller and PHY: Complete USB IP solution reduces engineering effort while reducing area
PCI Express Controller and PHY: High-performance, low-latency PCI Express controllers and PHYs supporting data rates up to 64GT/s enable real-time data connectivity, NVMe SSDs, and SD Express cards
Compute Express Link (CXL) Controller and PCIe 5.0 PHY: Very high-bandwidth, extremely low-latency IP supporting all three CXL protocols (CXL.io, CXL.cache, CXL.mem) and device types
CCIX Controller and PHY: CCIX IP solutions support data transfer speeds up to 32 Gbps and cache coherency for faster data access

Security IP
Security IP: Highly integrated, standards-based security IP solutions enable the most efficient silicon design and highest levels of security

Foundation IP
Embedded Memories and Logic Libraries: Low-latency embedded memories with standard and ultra-low leakage libraries provide a power- and performance-efficient foundation for SoCs

Processor IP
ARC HS Processors: Highly scalable ARC HS processors provide the high performance and energy efficiency required for network control plane processing, computational storage, AI co-processing, and other embedded processor applications in the cloud

About DesignWare IP
Synopsys is a leading provider of high-quality, silicon-proven IP solutions for SoC designs. The broad DesignWare IP portfolio
includes logic libraries, embedded memories, PVT sensors, embedded test, analog IP, wired and wireless interface IP, security
IP, embedded processors, and subsystems. To accelerate prototyping, software development and integration of IP into SoCs,
Synopsys’ IP Accelerated initiative offers IP prototyping kits, IP software development kits, and IP subsystems. Synopsys’ extensive
investment in IP quality, comprehensive technical support and robust IP development methodology enable designers to reduce
integration risk and accelerate time-to-market.

For more information on DesignWare IP, visit synopsys.com/designware.

©2021 Synopsys, Inc. All rights reserved. Synopsys is a trademark of Synopsys, Inc. in the United States and other countries. A list of Synopsys trademarks is
available at synopsys.com/copyright.html. All other names mentioned herein are trademarks or registered trademarks of their respective owners.
05/04/21.CS610890866-SG Bro-Cloud Computing Brochure.
