Tag: #Nvidia

  • Nvidia CEO Jensen Huang Personally Delivers World’s Smallest Supercomputer to Elon Musk

    At SpaceX’s Starbase facility in Texas, Nvidia founder and CEO Jensen Huang personally delivered the company’s recently released DGX Spark AI supercomputer to Elon Musk. According to a blog post from the chipmaker, the meeting took place as SpaceX prepared for the eleventh test flight of its Starship rocket. Huang described the occasion as “delivering the smallest supercomputer next to the biggest rocket.” In conversation with the SpaceX CEO, Huang recounted delivering the first DGX machine to OpenAI and explained how the DGX Spark carries that work forward.

    Meeting Between Musk and Huang

    Jensen Huang arrived at the SpaceX site surrounded by engineers and employees before meeting Elon Musk in the cafeteria. According to the blog post, the two briefly discussed Nvidia’s early partnership with OpenAI and how the new DGX Spark carries that work forward. The system can run models with up to 200 billion parameters locally, a significant advance in portable AI computing.

    Nvidia’s DGX Spark and its Features

    The DGX Spark is a compact AI supercomputer that delivers up to one petaflop of performance. It is aimed at developers, researchers, and artists who need serious computing power outside conventional data centres. Weighing roughly 1.2 kg, it pairs portability with cutting-edge hardware, including the Nvidia GB10 Grace Blackwell Superchip for high-speed AI processing, 128 GB of unified memory for seamless model training and inference, Nvidia ConnectX networking and NVLink-C2C for fast data transfer, NVMe storage, and HDMI output for direct visualisation.

    It comes preloaded with Nvidia’s entire AI software stack, which includes frameworks, libraries, pretrained models, and NIM microservices, allowing users to create chatbots, vision agents, and creative AI tools locally.
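
    For a sense of what building locally can look like in practice, here is a minimal sketch that queries a model served on the device through a NIM microservice’s OpenAI-compatible endpoint. The port, model name, and prompt are illustrative assumptions, not details from Nvidia’s announcement.

    ```python
    # Minimal sketch: chatting with a model served locally (e.g. on a DGX Spark)
    # through a NIM microservice's OpenAI-compatible API. The base_url, port,
    # and model identifier below are illustrative assumptions.
    from openai import OpenAI

    client = OpenAI(
        base_url="http://localhost:8000/v1",  # assumed local NIM endpoint
        api_key="not-needed-for-local-use",   # local servers typically ignore the key
    )

    response = client.chat.completions.create(
        model="meta/llama-3.1-8b-instruct",   # placeholder model name
        messages=[{"role": "user", "content": "Draft a short summary of the DGX Spark."}],
        max_tokens=200,
    )

    print(response.choices[0].message.content)
    ```

    Because the endpoint follows the same API shape as hosted services, the same client code could later point at a remote deployment by changing only the base URL.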

    To bring DGX Spark systems to market and turn desktop computers into AI-ready workstations, Nvidia has partnered with companies such as Acer, ASUS, Dell Technologies, HP, Lenovo, GIGABYTE, and MSI. Early adopters include Ollama in Palo Alto, the NYU Global Frontier Lab, Zipline, Arizona State University, and the studio of artist Refik Anadol. According to the company’s announcement, DGX Spark systems will be available worldwide through partner channels and Nvidia.com starting October 15.

    Quick Shots

    • Nvidia CEO Jensen Huang personally delivered the DGX Spark AI supercomputer to Elon Musk at SpaceX’s Starbase in Texas.

    • The delivery coincided with SpaceX’s 11th Starship rocket test.

    • Huang called it “delivering the smallest supercomputer next to the biggest rocket.”

    • Hardware: Nvidia GB10 Grace Blackwell Superchip, 128 GB unified memory, NVLink-C2C, NVMe storage, HDMI output.

    • Preloaded with Nvidia’s AI software stack: frameworks, libraries, pretrained models, and NIM microservices.

    • Aimed at developers, researchers, and artists who need powerful AI processing outside traditional data centres.

    • The DGX Spark combines portability and high-performance AI, enabling local model training, inference, and creative AI applications.

  • Nvidia CEO Jensen Huang Reveals the One Non-Tech Job That Will Lead the AI Race

    Workers are increasingly worried that their jobs are at risk as artificial intelligence (AI) drives a wave of cost-cutting, and Nvidia CEO Jensen Huang is offering little consolation. In a recent interview with Channel 4 News in the United Kingdom, he argued that electricians, plumbers, and carpenters, rather than office workers, will be the true beneficiaries of the AI era.

    Huang told the broadcaster that “the skilled craft segment of every economy is going to see a boom,” arguing that the buildout of AI data centres will demand continuous growth, “doubling and doubling and doubling every single year.”

    Even though recent evidence from the Yale Budget Lab suggests that AI has not yet substantially disrupted the labour market, his viewpoint is gaining traction among other executives. And if Huang is right, the skills that command higher pay may shift considerably over the next ten years.

    Why Are the Corporate and IT Sectors Concerned?

    Huang, whose company just committed up to $100 billion to OpenAI’s data centre buildout, contends that the real opportunity lies in building the physical infrastructure behind AI, rather than with the software experts and programmers usually assumed to be the obvious beneficiaries. His forecast echoes worries from other business executives who see a gap between the manpower needed for the industry’s ambitious data centre buildout and the capacity actually available.

    Larry Fink, the CEO of BlackRock, Inc. (BLK), for instance, raised the matter directly with the White House earlier this year, cautioning that tight immigration laws combined with waning interest in the trades among young Americans could produce a severe labour shortage. “We’re going to run out of electricians that we need to build out AI data centres,” Fink said during an energy conference in March. “I’ve even told members of the Trump team that. We just don’t have enough.”

    A single 250,000-square-foot data centre can employ up to 1,500 construction workers during buildout, many of whom earn over $100,000 plus overtime without a college degree. According to a recent McKinsey report, once a data centre is up and running it supports roughly 50 full-time maintenance jobs, each of which creates an additional 3.5 jobs in the local economy. With global data centre capital expenditure expected to reach $7 trillion by 2030, the kind of labour the IT industry requires may change significantly in the years ahead.

    What Does the New Research From Yale’s Budget Lab Say?

    Nearly three years after ChatGPT’s debut in November 2022, a new study released on 1 October by Yale’s Budget Lab finds minimal evidence of severe disruption to the labour market. Compared with earlier technological upheavals such as the rise of the personal computer and the internet, however, the mix of work is shifting slightly faster.

    Even so, the change has been gradual, and the patterns began before ChatGPT’s arrival, “undercutting fears that AI automation is currently eroding the demand for cognitive labour across the economy,” according to the paper. The researchers examined unemployment rates in high-risk industries, job shifts in AI-exposed occupations, and overall employment trends; none showed clear signs of AI-driven job losses.

    The occupational shifts appear to have started in 2021, before generative AI became broadly available, even in the industries most exposed to AI, such as professional, financial, and information services. The research also found that the work mix of fresh college graduates differs only slightly from that of older graduates, which could hint at some early effects. The Budget Lab cautions, however, that this may simply reflect a sluggish labour market that, as usual, is hitting younger people harder.

    Quick Shots

    • Demand for AI data centres is set to skyrocket, doubling yearly and requiring massive infrastructure buildouts.

    • Huang believes physical infrastructure roles will benefit more than software developers as AI expands.

    • Labour shortages in the trades could become a major bottleneck for AI growth, warns BlackRock CEO Larry Fink.

    • A single data centre can employ 1,500+ construction workers, many earning $100,000+ annually without a college degree.

  • Nvidia Commits $100 Billion Investment in OpenAI as Sam Altman Calls Compute the Future of Global Economy

    As a sign of the increasing need for artificial intelligence infrastructure, Nvidia has committed to investing up to $100 billion in OpenAI.

    The agreement, one of the largest in the AI industry, is expected to help OpenAI expand its computing capacity by building new data centres equipped with cutting-edge Nvidia chips.

    The two companies announced on 22 September that they had signed a statement of intent to proceed with the proposal, according to Bloomberg. The investment will be made in phases, starting with $10 billion once OpenAI deploys its first gigawatt of computing capacity, according to people familiar with the talks. In exchange, Nvidia will receive equity in OpenAI.

    Initiative Aims to Build Data Centres With a Combined Capacity of 10 GW

    The initiative intends to build data centres with a combined capacity of 10 gigawatts to train and run OpenAI’s massive AI models. These centres will use Nvidia’s newest processors, currently the most sought-after chips in the AI sector. Everything begins with compute, OpenAI CEO Sam Altman said, underscoring the significance of the partnership.

    Altman said OpenAI will use what “we are building with Nvidia to both create new AI breakthroughs and empower people and businesses with them at scale,” adding that compute infrastructure will be the foundation of the future economy. For OpenAI, the move comes at a critical moment: its well-known chatbot, ChatGPT, is used by almost 700 million people each week.

    Running these services requires vast amounts of processing power, and the company has previously run into shortages during major product launches. Altman has already hinted that OpenAI will introduce additional “compute-intensive” products in the coming weeks, which will require even more hardware.

    Nvidia Strengthens Its Position Through This Partnership

    The collaboration cements Nvidia’s position at the forefront of the AI revolution. The company has been using its financial resources to ensure that its chips remain the foundation of AI systems around the world. By keeping OpenAI as a major customer despite OpenAI’s interest in developing its own hardware, Nvidia could further entrench its dominance even as rivals push competing technologies.

    Although neither company has disclosed a precise timeframe for the investment, both acknowledge that talks are under way to reach a final deal as soon as possible. The announcement has already lifted market sentiment.

    In New York trading on 22 September, Nvidia’s stock rose by as much as 4%, bringing its gain for the year to almost 36%. The quick jump shows how closely investors are watching the company’s moves in AI, which is seen as a key driver of future growth.

    Quick Shots

    • Sam Altman highlights computing power as the foundation of tomorrow’s global economy.

    • The move secures Nvidia’s role as the backbone of AI systems worldwide.

    • Despite exploring its own hardware, OpenAI remains reliant on Nvidia’s chips.

    • Nvidia’s stock jumped 4% on 22 September, with a 36% gain year-to-date.

  • Nvidia Acquihired Enfabrica and Its CEO for $900M: Acquisition Explained

    Ever heard of an acquihire? It is when a company buys another company mainly for its talent. Nvidia just spent over $900 million on an acquihire: it licensed Enfabrica’s technology and hired not just any talent from its team, but the CEO (plus others) directly. The deal closed on September 19, 2025. This is unusual for Nvidia, which tends to invest in companies rather than absorb their teams. Notably, Enfabrica’s technology can connect 100,000+ GPUs (graphics chips) so they work more or less like one giant computer. Does this mean Nvidia is trying to build complete, ready-to-go AI systems instead of just chips?

    What Did Nvidia Do?

    Nvidia put the $900 million toward two things:

    • To hire Enfabrica’s CEO, Rochan Sankar, and some other employees, too.
    • Licensing (renting the rights to use) Enfabrica’s technology.

    Who Is Enfabrica?

    Enfabrica is a U.S. startup founded in 2019. Its networking technology connects 100,000+ GPUs (graphics chips) so they work together like one big computer. Such technology is crucial for AI models such as ChatGPT, which consume huge amounts of computing power.


    Why Did Nvidia Do This?

    • Nvidia’s GPUs are the backbone of the AI industry; almost all major AI models, including ChatGPT, Gemini, and Claude, run on its chips.
    • In data centres, these GPUs sit in racks, like shelves full of chips wired together.
    • More chips mean more computing power, but connecting them is hard: the more chips you add, the tougher it is to make them work together.
    • Enfabrica’s technology solves exactly this problem, letting Nvidia scale up AI supercomputers. So, yes, Nvidia can now sell complete, ready-to-go AI systems instead of just chips.

    How Does This Fit Into Nvidia’s History?

    Past acquisitions:

    • In 2019, Nvidia bought Mellanox for $6.9B; its networking technology now backs Nvidia’s Blackwell chips.
    • In 2022, Nvidia abandoned its $40B bid for Arm under regulatory pressure.
    • In 2024, it bought Run:ai for $700 million; Run:ai optimizes AI workloads on Nvidia hardware.

    Recent investments:

    • Just a few days ago (September 17, 2025), Nvidia invested in U.K. startup Nscale’s data centers.
    • Then, on September 18, 2025, it took a whopping $5B stake in Intel to partner on developing AI processors.

    How Does This Compare to Other Tech Giants?

    So why do companies acquihire (buy the talent and license the tech) rather than acquire outright? Largely to avoid legal and regulatory risk. Nvidia is not alone; several others are following the same strategy:

    • Meta put $14.3B into Scale AI, taking a 49% stake and bringing over its founder and team, on June 13, 2025.
    • Google spent about $2.4B on the Windsurf founder + team on July 11, 2025.
    • In August 2024, Google acquired the Character.AI team.
    • Microsoft onboarded a team from Inflection AI in March 2024.
    • Amazon hired a team from Adept AI on June 28, 2024.

    Why Is This Important for Nvidia?

    The AI race is picking up pace, and Nvidia wants to win it with the best chips plus the best computing talent.

    Nvidia is making sure:
    • To onboard the best engineers.
    • To secure the networking tech needed to connect big clusters of GPUs.