The chip world is shifting beneath our feet, and for MediaTek, the move isn’t incremental — it’s bold. The company recently announced that it aims to generate billions in revenue from AI accelerator chips by 2027.
Why this matters
MediaTek has long been known for its smartphone systems-on-chip (SoCs). But now the company is venturing into the AI infrastructure domain, specifically ASIC-based AI accelerator chips for data centres. The target? Not just millions, but billions in revenue by 2027. According to the company’s CEO, MediaTek expects around US$1 billion from cloud AI chips in 2026, followed by “multiple billions” in 2027.
That shift signals how aggressively the semiconductor industry is pivoting toward artificial-intelligence hardware. For MediaTek, it’s about stepping up from mobile components to playing in the core of the AI compute stack.
The numbers and ambition
MediaTek says the total addressable market (TAM) for data-centre AI ASIC chips is estimated at about US$50 billion, and it plans to capture 10–15% of that within the next two years.
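To put those figures in perspective, here is a rough back-of-the-envelope sketch (illustrative only, using the TAM and share range quoted above rather than any company guidance) of the annual revenue such a share would imply:

```python
# Rough check on the figures quoted above (illustrative only; the TAM and
# share range come from the article, not from company guidance).
tam_usd_bn = 50.0                   # estimated data-centre AI ASIC TAM, US$ billions
share_low, share_high = 0.10, 0.15  # MediaTek's targeted market-share range

implied_low = tam_usd_bn * share_low    # 5.0
implied_high = tam_usd_bn * share_high  # 7.5

print(f"Implied annual revenue: US${implied_low:.1f}bn to US${implied_high:.1f}bn")
# Output: Implied annual revenue: US$5.0bn to US$7.5bn
```

That range sits well above the roughly US$1 billion expected in 2026 and is consistent with the “multiple billions” the company is targeting for 2027.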
Here are some key targets:
- Revenue of about US$1 billion in 2026 from its first AI accelerator ASIC project.
- “Multiple billions” in revenue in 2027 from that project.
- Additional projects starting deliveries in 2028 and beyond.
If it hits those targets, MediaTek could become a meaningful new contender in the AI compute hardware arena, not just a smartphone chip designer.
How MediaTek is making the shift
Let’s look at how MediaTek is repositioning itself to move into the AI accelerator chip space:
1. Strategic partnerships
MediaTek is collaborating with NVIDIA on co-designing the GB10 Grace Blackwell superchip — used in NVIDIA’s DGX Spark system.
This kind of strategic deal gives MediaTek an entry into the data-centre AI hardware world, leveraging NVIDIA’s ecosystem while bringing its own design capability to bear.
2. Leveraging advanced process nodes
MediaTek is reportedly tapping into cutting-edge chip processes (including 2nm tech) to build energy-efficient AI inference hardware.
By building accelerators on advanced nodes, MediaTek can aim for performance and power-efficiency gains — crucial for AI workloads.
3. Targeting inference and cloud AI workloads
The first project is cloud-oriented: AI accelerator ASICs aimed at data centres. That means high volume, high performance, and high margin potential.
The shift from mobile chips to AI accelerator chips is a major repositioning of MediaTek’s hardware identity.
Market context and competitive pressures
MediaTek isn’t doing this in a vacuum. The AI hardware space is already crowded and rapidly evolving.
- Competitors such as Qualcomm are planning data-centre AI chips for 2026–2027.
- The incumbent giant, NVIDIA, dominates the GPU side of AI compute today, but ASICs and custom accelerators are emerging as alternatives.
- MediaTek must balance R&D costs, supply-chain risks, and timing. The advanced process nodes (2nm, 3nm) are costly and complex.
Given the market size (US$50 billion TAM) and MediaTek’s target share (10–15%), the ambition is big, and the risks are non-trivial.
What must go right
For MediaTek to hit “billions in AI accelerator chip revenue by 2027”, a number of factors need to align:
- Timely product delivery: The accelerator project must ship on schedule, be adopted by cloud/data-centre customers, and scale.
- Strong partner network: Collaboration with NVIDIA and likely others must translate into real production shipments and ecosystem support.
- Manufacturing and cost control: Advanced nodes are expensive; MediaTek must manage yield, cost, and supply chain to maintain margins.
- Meeting performance/power needs: AI workloads demand high compute + low power. If MediaTek’s chips don’t deliver, adoption will lag.
- Competitive differentiation: With others (Qualcomm among them) entering the space, MediaTek must clearly carve out a niche or win anchor customers.
Why it matters for the wider industry
MediaTek’s ambition signals several broader trends with implications beyond one company:
- AI compute is shifting: from general-purpose GPUs toward custom ASICs and accelerators, as companies want hardware optimized for their use cases.
- New entrants in infrastructure: A company known for mobile chips enters data-centre AI. That shows the scope of opportunity and the blurring lines between device and infrastructure chips.
- Global chip ecosystem evolution: A Taiwan-based mobile-chip specialist partnering with major players and tapping advanced nodes shows how the global semiconductor race keeps evolving.
- Margin and volume potential: Large-scale infrastructure chips carry big revenue potential, but cost, complexity, and risk are also high.
What to watch next
If you’re monitoring MediaTek’s progress (or the AI-chip space broadly), here are key milestones:
- Specific customer wins or design wins announced for its AI accelerator chips.
- First revenue recognition for the project, slated for 2026.
- Technology/delivery updates (e.g., node, packaging, performance metrics).
- Supply chain/production risk disclosures (yields, fab agreements).
- Competitive announcements from other players (Qualcomm, others) that might affect MediaTek’s market share.
- Market response: whether MediaTek’s shares or valuation shift in response to these strategic moves.
Conclusion
MediaTek’s plan to reach “billions in AI accelerator chip revenue by 2027” is a bold leap from smartphone SoCs to the heart of AI infrastructure. If the company pulls it off, it could reshape both its identity and the competitive landscape in data-centre AI. MediaTek has the ambition, the partnerships, and a credible target; the question now is execution. For investors, industry watchers, and tech strategists, the next 12–24 months will be critical.