However, when the motor inertia is larger than the load inertia, the motor will require more power than is otherwise necessary for the application. This increases cost, both because a larger-than-necessary motor costs more to purchase and because the added power consumption raises operating costs. The solution is to use a gearhead to match the inertia of the motor to the inertia of the load.
Recall that inertia is a measure of an object's resistance to change in its motion and is a function of the object's mass and shape. The higher an object's inertia, the more torque is needed to accelerate or decelerate it. This means that when the load inertia is much larger than the motor inertia, the mismatch can cause excessive overshoot or lengthen settling times. Both conditions can decrease production-line throughput.
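The torque-versus-inertia relationship above can be sketched numerically. This is a minimal illustration using the standard rotational form of Newton's second law (T = J·α); the inertia and acceleration values are hypothetical, chosen only to show the scaling:

```python
# Illustrative only: torque needed to accelerate a rotating load.
# T = J * alpha (torque = inertia * angular acceleration), SI units.

def required_torque(inertia_kg_m2: float, accel_rad_s2: float) -> float:
    """Torque (N·m) needed to achieve a given angular acceleration."""
    return inertia_kg_m2 * accel_rad_s2

# Doubling the inertia doubles the torque needed for the same move.
t_small = required_torque(0.001, 500.0)  # hypothetical small load
t_large = required_torque(0.002, 500.0)  # twice the inertia
print(t_small, t_large)
```

For the same commanded acceleration, the higher-inertia load demands twice the torque, which is why a large load-to-motor inertia mismatch makes the system hard to control.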
Inertia Matching: Today's servo motors produce more torque relative to frame size, thanks to dense copper windings, lightweight materials, and high-energy magnets. This creates greater potential for inertial mismatches between servo motors and the loads they are trying to move. Using a gearhead to better match the inertia of the motor to the inertia of the load allows the use of a smaller motor and results in a more responsive system that is easier to tune. Again, this is achieved through the gearhead's ratio: the inertia of the load reflected back to the motor is reduced by a factor of 1/ratio².
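The 1/ratio² effect can be seen with a short calculation. The load and rotor inertia values below are hypothetical, picked only to show how quickly the reflected mismatch shrinks as the ratio grows:

```python
# Sketch: load inertia as seen by the motor through a gearhead.
# The reflected inertia drops with the square of the gear ratio.

def reflected_inertia(load_inertia: float, ratio: float) -> float:
    """Load inertia reflected to the motor shaft (same units as input)."""
    return load_inertia / ratio**2

load_j = 0.05     # kg·m^2, hypothetical load inertia
motor_j = 0.0005  # kg·m^2, hypothetical motor rotor inertia

for ratio in (1, 5, 10):
    j_ref = reflected_inertia(load_j, ratio)
    print(f"{ratio}:1 ratio -> mismatch {j_ref / motor_j:.0f}:1")
```

With these assumed numbers, a 10:1 gearhead cuts the reflected inertia by a factor of 100, turning a 100:1 direct-drive mismatch into a 1:1 match.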
As servo technology has evolved, with manufacturers making smaller yet more powerful motors, gearheads have become increasingly essential companions in motion control. Finding the optimal pairing must take into account many engineering considerations.
So how does a gearhead deliver the power required by today's more demanding applications? That goes back to the basics of gears and their ability to change the magnitude or direction of an applied force.
The gears and the number of teeth on each gear create a ratio. If a motor can generate 20 in-lb of torque and a 10:1 gearhead is attached to its output, the output torque will be close to 200 in-lb (slightly less, because of friction losses in the gearhead). With the ongoing focus on developing smaller footprints for motors and the equipment they drive, the ability to pair a smaller motor with a gearhead to achieve the desired torque output is invaluable.
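The torque multiplication in this example is a one-line calculation. The 95% efficiency figure below is an assumption for illustration (real gearhead efficiencies vary by type and stage count):

```python
# Hedged sketch: output torque through a gearhead ratio, with an
# assumed efficiency to account for friction losses.

def output_torque(motor_torque: float, ratio: float,
                  efficiency: float = 0.95) -> float:
    """Gearhead output torque (same units as motor_torque)."""
    return motor_torque * ratio * efficiency

# 20 in-lb motor with a 10:1 gearhead -> close to 200 in-lb.
print(output_torque(20.0, 10.0))  # 190.0 with the assumed 95% efficiency
```

This is why the text says "close to" 200 in-lb: the ideal ratio multiplication is trimmed by the gearhead's own losses.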
A motor may be rated at 2,000 rpm, but your application may require only 50 rpm. Attempting to run the motor at 50 rpm may not be optimal, for the following reasons:
If you are operating at a very low speed, such as 50 rpm, and your motor feedback resolution is not high enough, the update rate of the electronic drive may cause velocity ripple in the application. For example, with a motor feedback resolution of 1,000 counts/rev you have a measurable count every 0.36 degree of shaft rotation. If the digital drive controlling the motor has a velocity-loop update of 0.125 milliseconds, at 50 rpm (300 deg/sec) it will look for a measurable count every 0.0375 degree of shaft rotation. When it does not find that count, it speeds the motor up until it does. By the time it finds the next measurable count, the rpm is too fast for the application, so the drive slows the motor back down to 50 rpm, and the whole process starts over. This continuous rise and fall in rpm is what causes velocity ripple in an application.
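The numbers in this example can be checked directly (all values come from the text above):

```python
# Numeric check of the feedback-resolution example above.

counts_per_rev = 1000
deg_per_count = 360 / counts_per_rev     # 0.36 deg between measurable counts

rpm = 50
deg_per_sec = rpm * 360 / 60             # 300 deg/sec at 50 rpm
loop_s = 0.125e-3                        # 0.125 ms velocity-loop update
deg_per_loop = deg_per_sec * loop_s      # 0.0375 deg of travel per update

# The drive expects a new count every 0.0375 deg, but one only arrives
# every 0.36 deg -- roughly 9.6 loop updates pass with no new position data.
print(deg_per_count, deg_per_loop, deg_per_count / deg_per_loop)
```

Nearly ten velocity-loop cycles between measurable counts is what drives the hunt-and-correct behavior described above.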
A servo motor running at low rpm operates inefficiently. Eddy currents are loops of electrical current induced within the motor during operation. These eddy currents produce a drag force within the motor and have a greater negative effect on motor efficiency at lower rpm.
An off-the-shelf motor's parameters may not be ideally suited to running at low rpm. When an application runs such a motor at 50 rpm, it is essentially not using most of its available speed range. Because the voltage constant (V/krpm) of the motor is set for a higher rpm, the torque constant (Nm/A), which is directly related to it, is lower than it needs to be. Consequently, the application needs more current to drive the motor than it would with a motor designed specifically for 50 rpm.
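The link between the two constants can be made concrete: in SI units, the torque constant (N·m/A) is numerically equal to the back-EMF constant expressed in V per rad/s. The winding values below are assumptions chosen only to show why a high-speed winding (low V/krpm) needs more current for the same torque:

```python
# Sketch with assumed winding values: a motor wound for high speed has a
# low voltage constant, hence a low torque constant, hence more current
# for the same torque.
import math

def kt_from_ke(ke_v_per_krpm: float) -> float:
    """Torque constant (N·m/A) from back-EMF constant in V/krpm.
    Kt (N·m/A) equals Ke in V per rad/s; 1 krpm = 1000*2*pi/60 rad/s."""
    return ke_v_per_krpm / (1000 * 2 * math.pi / 60)

torque_needed = 1.0  # N·m, hypothetical load torque

for ke in (20.0, 80.0):  # V/krpm: high-speed vs low-speed winding (assumed)
    kt = kt_from_ke(ke)
    print(f"Ke={ke} V/krpm -> Kt={kt:.3f} N·m/A -> I={torque_needed/kt:.1f} A")
```

With these assumed windings, the high-speed motor draws roughly four times the current of the low-speed winding to produce the same 1 N·m.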
A gearhead's ratio reduces the motor rpm, which is why gearheads are sometimes called gear reducers. With a 40:1 gearhead, the motor rpm at the input of the gearhead is 2,000 rpm while the rpm at the output is 50 rpm. Running the motor at the higher rpm avoids the velocity-ripple and eddy-current concerns described above. As for the motor-constant concern, the gearhead's mechanical advantage allows the design to draw less torque and current from the motor.
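Putting the pieces together: the 2,000 rpm and 40:1 figures come from the text, while the 100 in-lb load torque and 95% efficiency are assumed for illustration:

```python
# Sketch: motor speed and torque required to drive a load through a
# 40:1 gearhead (load torque and efficiency are assumed values).

def motor_requirements(load_rpm: float, load_torque: float, ratio: float,
                       efficiency: float = 0.95):
    """Return (motor rpm, motor torque) needed for the given load."""
    return load_rpm * ratio, load_torque / (ratio * efficiency)

rpm, torque = motor_requirements(50.0, 100.0, 40.0)
print(rpm, round(torque, 2))  # motor runs at 2000.0 rpm, ~2.63 in-lb
```

The motor spins in its efficient high-speed range while supplying only a fraction of the load torque, which is exactly the trade the gearhead is there to make.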