Ideally the clutch's holding power should be matched to the power you make, with very little reserve. Here's a simplified explanation: let's assume the engine makes 500 ft-lbs and the clutch's capacity is 700 ft-lbs before it begins to slip. When you launch the car, that clutch is going to draw 700 ft-lbs, the 500 ft-lbs the engine is making at WOT plus another 200 ft-lbs of stored inertia energy, which causes the rotating assembly to lose rpm. That extra 200 ft-lbs makes the launch more violent, but as soon as engine rpm is drawn down to the point where it syncs up with vehicle speed, rpm stops dropping and that transfer of an additional 200 ft-lbs of inertia energy stops.

The downside is that after you've lost the rpm and used that inertia energy, it has to be paid back in full before the engine can recover the rpm it lost. The inertia transfer that made the car launch harder initially now slows the car, because it reverses and some of the engine's power has to go toward putting that spent energy back into the rotating assembly. In the end, that temporary 200 ft-lb boost doesn't actually net you any performance gain.
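To put rough numbers on that "loan and payback" idea, here's a quick back-of-envelope sketch in Python. The rotating-assembly inertia, launch rpm, and the size of the rpm drop are assumptions picked just to show the bookkeeping, not measured figures from any particular combination.

```python
import math

# Illustrative numbers only -- the inertia and the rpm drop are assumptions,
# not measurements from a real engine.
I_ROTATING = 0.5        # rotating-assembly inertia, slug-ft^2 (ft-lb-s^2)
RPM_BEFORE = 6500.0     # assumed rpm at the hit
RPM_AFTER  = 5000.0     # assumed rpm after the clutch drags the engine down

def rads(rpm):
    """Convert rpm to rad/s."""
    return rpm * 2.0 * math.pi / 60.0

# Kinetic energy the rotating assembly gives up while the clutch pulls the
# engine down -- this is the "extra" energy that briefly boosts the launch.
energy_loaned = 0.5 * I_ROTATING * (rads(RPM_BEFORE)**2 - rads(RPM_AFTER)**2)

print(f"Energy borrowed from the rotating assembly: {energy_loaned:.0f} ft-lb")
# Every ft-lb of that energy has to be put back by the engine before it can
# climb back to the rpm it lost, so the net gain over the run is zero.
```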
Why subject your transmission and drivetrain to that extra 200 ft-lbs if it doesn't net you anything?
What if that extra 200 ft-lbs of holding power gets you a broken transmission?
If a clutch with only 600 ft-lbs of capacity were used, it would slip roughly twice as long (the excess torque pulling the engine down is only 100 ft-lbs instead of 200), which means the car would be traveling faster at the point where rpm and vehicle speed finally sync up, so there's much less bog. Not only does the transmission see less abuse, but the engine doesn't lose as much rpm after the launch and after the shifts, so it pulls from a higher average rpm where it makes more power.
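If you want to see how those two capacities play out, here's a crude launch sketch. The engine inertia, the reflected vehicle inertia, and the launch rpm are made-up round numbers, and a real clutch doesn't have one fixed capacity, so treat it as a picture of the trend rather than a prediction.

```python
import math

# Crude launch sketch for the two clutch capacities discussed above.
# Inertias and launch rpm are illustrative assumptions, not data from a real car.
ENGINE_TORQUE = 500.0   # ft-lb at WOT, per the example
LAUNCH_RPM    = 6500.0  # assumed launch rpm
I_ENGINE      = 0.5     # assumed rotating-assembly inertia, slug-ft^2
I_VEHICLE     = 1.5     # assumed car + driveline inertia reflected to the crank
DT            = 0.001   # integration step, seconds

def rads(rpm):
    return rpm * 2.0 * math.pi / 60.0

def launch(clutch_capacity):
    """Integrate the slip phase; return (slip time, rpm at lock-up)."""
    w_eng = rads(LAUNCH_RPM)   # engine speed, rad/s
    w_car = 0.0                # crank-equivalent driveline speed, rad/s
    t = 0.0
    while w_eng > w_car:
        # While slipping, the clutch passes its full capacity:
        # the engine sees (torque made - torque taken) and loses rpm,
        # the driveline sees the clutch's capacity and gains speed.
        w_eng += (ENGINE_TORQUE - clutch_capacity) / I_ENGINE * DT
        w_car += clutch_capacity / I_VEHICLE * DT
        t += DT
    return t, w_car * 60.0 / (2.0 * math.pi)

for cap in (700.0, 600.0):
    slip_time, lockup_rpm = launch(cap)
    print(f"{cap:.0f} ft-lb clutch: slips for {slip_time:.2f} s, "
          f"locks up at about {lockup_rpm:.0f} rpm")
```

With these made-up numbers the lower-capacity clutch slips noticeably longer and locks up several hundred rpm higher; how close you actually get to "twice as long" depends on the real inertias, gearing, and how the clutch's capacity behaves as it heats up.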