Motor Control · 14 min read · 2026-02-20

BLDC Motor Driver Selection: How to Match MOSFETs, Gate Drivers, and Control ICs

Designing a BLDC motor drive requires careful matching of power devices, gate drivers, and control algorithms. This guide walks through the key tradeoffs, from MOSFET Rds(on) to control loop bandwidth, for industrial and consumer motor applications.

BLDC Motor Drive Architecture


A modern BLDC motor drive consists of three power stages: a MOSFET-based three-phase inverter, a gate driver IC, and a motor control MCU or dedicated control IC.


MOSFET Selection Criteria


For BLDC drives, MOSFET selection is dominated by three parameters:

  • **Rds(on)**: Determines conduction losses. At 20A motor current, a 2mΩ MOSFET dissipates 0.8W versus 2W for a 5mΩ part.
  • **Qg (total gate charge)**: Determines switching speed and driver current requirements.
  • **Vdss (breakdown voltage)**: Must exceed the worst-case supply voltage plus switching transients. Apply 80% derating so the worst-case voltage stays at or below 0.8 × Vdss: a 48V system → 80V MOSFET minimum.

Gate Driver Matching


The gate driver must source and sink enough current to charge and discharge the MOSFET gate within your target switching time:

  • Calculate the required peak driver current: Ipeak = Qg / tsw
  • Example: 10ns switching with ~100nC of total gate charge → ~10A peak driver current

MPS motor driver ICs combine gate drivers with integrated MOSFETs (SmartFET) for currents up to 15A, reducing PCB footprint by 60%.


Control Loop Bandwidth


Industrial servo drives typically require 2–5 kHz current loop bandwidth. Key considerations:

  • PWM frequency: 20kHz minimum for audibly silent operation
  • ADC sampling: at least 2× PWM frequency for accurate current measurement
  • Phase current measurement: single-shunt reconstruction requires precise sample timing within each PWM cycle