Fungible reveals its Data Processing Unit

Sienna Rowley

At Hot Chips 2020, Fungible announced the Fungible Data Processing Unit (DPU). The company describes the DPU as a transformational technology that will power next-generation, high-performance, efficient, and cost-optimized scale-out data centers. The new DPU is optimized for data interchange and data-centric computations, and it complements the CPU and GPU.

Availability

The Fungible DPU is available immediately in two performance tiers:

  • Fungible F1 DPU – an 800Gbps processor designed specifically for high-performance storage, analytics, and security platforms.
  • Fungible S1 DPU – a 200Gbps processor optimized for host-side use cases, including storage initiator, bare-metal virtualization, NFVi/VNF applications, and distributed node security.

The Fungible DPU incorporates two core innovations:

  • A new network engine that implements the endpoint of the high-performance TrueFabric™, delivering deterministic low latency, full cross-sectional bandwidth, congestion and error control, and high security at any scale (from 100s to 100,000s of nodes).
  • A programmable data-path engine that performs data-centric computations at very high speed while retaining flexibility comparable to general-purpose CPUs. The engine is programmed in C using industry-standard toolchains and is designed to execute many data-path computations concurrently (see the sketch after this list).
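To illustrate the kind of per-packet, data-centric computation the article refers to, here is a minimal sketch in plain standard C: an inline 16-bit ones'-complement checksum computed over a batch of packet payloads. This is purely illustrative and assumes nothing about Fungible's actual SDK or toolchain; all names in the sketch are hypothetical.

```c
/*
 * Illustrative only: a generic example of a per-packet, data-centric
 * computation (a 16-bit ones'-complement checksum) of the sort a
 * data-path engine runs inline at high rate. Plain standard C; not
 * Fungible's SDK. All names here are hypothetical.
 */
#include <stdint.h>
#include <stdio.h>
#include <string.h>

/* Compute the Internet-style ones'-complement checksum over a buffer. */
static uint16_t checksum16(const uint8_t *data, size_t len)
{
    uint32_t sum = 0;

    /* Sum 16-bit words, padding a trailing odd byte with zero. */
    for (size_t i = 0; i + 1 < len; i += 2)
        sum += (uint32_t)((data[i] << 8) | data[i + 1]);
    if (len & 1)
        sum += (uint32_t)(data[len - 1] << 8);

    /* Fold carries back into the low 16 bits, then complement. */
    while (sum >> 16)
        sum = (sum & 0xFFFF) + (sum >> 16);
    return (uint16_t)~sum;
}

int main(void)
{
    /* A hypothetical batch of packet payloads, as a data-path loop might see. */
    const char *packets[] = { "payload-A", "payload-B", "payload-C" };

    for (size_t i = 0; i < sizeof(packets) / sizeof(packets[0]); i++) {
        uint16_t csum = checksum16((const uint8_t *)packets[i],
                                   strlen(packets[i]));
        printf("packet %zu: checksum 0x%04x\n", i, csum);
    }
    return 0;
}
```

On a DPU, the point is that computations like this are offloaded from the host CPU and, per the article, many such data-path computations run concurrently on the programmable engine rather than serially on general-purpose cores.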

Pradeep Sindhu, CEO and co-founder of Fungible, said the Fungible DPU was built to address two of the biggest challenges in scale-out data centers: the inefficient execution of data-centric computations and inefficient data interchange between nodes. Data-centric computations are increasingly prevalent in data centers, with prominent examples being the computations performed in the storage, network, security, and virtualization data paths. Today, these computations are executed inefficiently by existing processor architectures. This inefficiency leads to overprovisioning and underutilization of resources, resulting in data centers that are significantly more expensive to build and operate. Removing this inefficiency will also accelerate the development of modern applications such as AI and analytics.

