Richard Ehrenberg
Simple question — complex answer!
The time when the greatest demands are made on the ignition system is during cold-weather cranking. Under these conditions, with engine oil that may be molasses-like and a battery with maybe 50% of its normal cranking capacity, the voltage available to the ignition system may be as low as 8 volts. If the system were designed for 12-13 volts, the output (spark) at 8V would be marginal. And at low temps, with who-knows-what for an A/F mixture, the hottest spark possible is needed to light off the mixture.
What the engineers (mainly Charles Kettering / Delco) devised to fix this was a system optimized for 8 – 9 volts of battery potential. If, however, this were operated on the full, normal ~13V, it would fry quickly, especially the coil and points. Hence, the ballast resistor. What it does, during normal operation, is drop the battery voltage to the required 8 – 9V. It does nothing during cranking – the ignition switch bypasses it.
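The running-vs-cranking voltages can be sketched as a simple Ohm's-law divider. This is only a rough steady-state illustration (points closed, DC); the 0.5-0.6 ohm ballast value comes from the stock spec mentioned below, while the ~1.5 ohm coil primary resistance is an assumed, typical figure for a stock-type coil:

```python
# Rough sketch: the ballast resistor and coil primary form a voltage divider
# while the points are closed. Ballast value per the stock spec (~0.55 ohms);
# the ~1.5 ohm coil primary is an assumption for illustration.

def coil_voltage(v_battery, r_ballast, r_coil):
    """Steady-state voltage across the coil primary (simple Ohm's-law divider)."""
    return v_battery * r_coil / (r_ballast + r_coil)

# Normal running: ~13 V system, stock ballast in circuit
running = coil_voltage(13.0, 0.55, 1.5)   # ~9.5 V, near the 8-9 V design target

# Cranking: ignition switch bypasses the ballast (0 ohms), battery sagging to ~8 V
cranking = coil_voltage(8.0, 0.0, 1.5)    # the full 8 V reaches the coil

print(f"running: {running:.1f} V, cranking: {cranking:.1f} V")
```

So even with the battery dragged down to 8V during cranking, the bypassed ballast leaves the coil seeing roughly what it sees in normal running.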
You didn’t say if your ignition system is 100% stock. If so (including a stock-spec coil), I can only assume you have been using garbagey Chinese ballasts; the stockers normally lasted half a century. Also be aware that many ballasts are replaced in vain when the trouble lies elsewhere, commonly at the bulkhead connector.
If you have swapped in a high-output coil, it can pull too much current through the ballast and cook it. The fix is a lower-resistance ballast (the stocker was 0.5 to 0.6 ohms), but that will significantly shorten breaker-point life.
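The overheating effect is easy to put numbers on. Again a hedged sketch: the 0.55 ohm ballast is the stock spec from above; the coil primary resistances (stock ~1.5 ohms, high-output ~0.7 ohms) are assumed figures for illustration:

```python
# Rough illustration of why a low-resistance "high-output" coil can cook a
# stock ballast. Steady-state DC, points closed. Ballast per the stock spec;
# both coil primary resistances are assumptions.

def ballast_watts(v_battery, r_ballast, r_coil):
    """Power dissipated in the ballast: I^2 * R, with I set by the total resistance."""
    i = v_battery / (r_ballast + r_coil)
    return i * i * r_ballast

stock = ballast_watts(13.0, 0.55, 1.5)   # roughly 22 W
hot = ballast_watts(13.0, 0.55, 0.7)     # roughly 60 W, nearly triple the heat

print(f"stock coil: {stock:.0f} W, high-output coil: {hot:.0f} W")
```

Roughly tripling the heat in a part sized for the stock load explains the short ballast life; the lower-resistance ballast cuts its dissipation back down, at the cost of more current through the points.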
If you have swapped to Chrysler-style (1973) electronic ignition, much of the above still applies. If you have an aftermarket system, you’re on your own!
Rick