
        Accelerating Scientific Innovation

        Enabling the Next Generation of Data Analytics at up to 200Gb/s


        NVIDIA Mellanox InfiniBand Products

        NVIDIA® Mellanox® InfiniBand solutions incorporate In-Network Computing technology that performs data algorithms within the network, delivering up to ten times higher performance and enabling the era of “data-centric” data centers.

        NVIDIA Mellanox ConnectX-6 VPI adapter

        InfiniBand (VPI) Adapters

        Enhancing the top Supercomputers and Clouds

        NVIDIA Mellanox's line of InfiniBand products delivers the highest productivity, enabling compute clusters and converged data centers to operate at any scale while reducing operational costs and infrastructure complexity.

        InfiniBand (VPI) Adapter ICs

        World-class cluster, network, and storage performance

        The ConnectX® family with Virtual Protocol Interconnect (VPI), supporting InfiniBand and Ethernet connectivity with hardware offload engines for Overlay Networks ("Tunneling"), provides the highest-performing and most flexible interconnect solution for PCI Express Gen3 and Gen4 servers.

        NVIDIA Mellanox ConnectX-6 adapter silicon
        NVIDIA Mellanox BlueField-2 smart adapter

        High-Performance Programmable DPUs

        Empowering the Next Generation of Secure Cloud

        The NVIDIA Mellanox BlueField® Data Processing Unit (DPU), with advanced software and programmability, provides data centers with new levels of performance, security, and functionality for innovative networking and I/O acceleration.

        InfiniBand Switches

        World-class High-Performance Computing at 40/56/100/200Gb/s Port Speeds

        NVIDIA Mellanox InfiniBand switch systems deliver the fastest data speeds and lowest latency, with smart accelerators that provide the highest efficiency and resiliency, making them the best choice to connect the world’s top HPC and artificial intelligence supercomputers.

        NVIDIA Mellanox QM8790 InfiniBand smart switch
        NVIDIA Mellanox Quantum switch silicon

        InfiniBand/VPI Switch Silicon

        Building the Highest Performing Server and Storage System Interconnect Solution

        With reduced power, a smaller footprint, and fully integrated PHY capabilities, the NVIDIA Mellanox switch silicon ICs provide network architects with the critical feature sets needed to build fabric-flexible server and storage systems that meet the increasing performance demands of their customers.

        InfiniBand Interconnect

        LinkX InfiniBand Cables and Transceivers

        NVIDIA Mellanox LinkX® cables and transceivers are designed to maximize the performance of High-Performance Computing networks, which require high-bandwidth, low-latency connections between compute nodes and switch nodes.

        NVIDIA Mellanox LinkX cables
        NVIDIA Mellanox Skyway

        Gateway & Router Systems

        InfiniBand to Ethernet Gateway Appliance for High Performance and Cloud Data Centers

        Performance-hungry data center environments and applications leverage InfiniBand's high data throughput, extremely low latency, and In-Network Computing acceleration engines to deliver world-leading application performance and scalability.

        Long-Reach Systems

        InfiniBand Long-Haul Systems

        NVIDIA Mellanox Long-Haul systems enable seamless connectivity between remote InfiniBand data centers, storage, and other InfiniBand platforms. MetroX-2 and MetroX extend InfiniBand’s high data throughput, native Remote Direct Memory Access (RDMA) communications, advanced routing, and other advanced capabilities to remote InfiniBand platforms located tens of kilometers apart.

        NVIDIA Mellanox MetroX TX6240
        NVIDIA Mellanox UFM

        Management Systems

        AI-powered Cyber Intelligence and Analytics Platforms

        The NVIDIA Mellanox UFM® platforms revolutionize data center network management by combining enhanced, real-time network telemetry with AI-powered cyber intelligence and analytics to support scale-out InfiniBand data centers.