Hello everyone,
I'm working on a class AB push-pull amplifier design. The widely varying load impedance of the output stage induces substantial voltage sag in the B+ supply (presumably), so I've been exploring ways of regulating the B+. Put differently, I don't like how high my power supply's output impedance is. My current concept is a linear power MOSFET series regulator, with the voltage reference established by a pair of VR tubes in series. That part of the design is far enough along that I'm at least game to try it, although it does bring a substantial heat sink into the picture for the power MOSFET. I'd also venture that this amplifier (10-14 W) is about the upper power limit at which this strategy is viable.
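For what it's worth, here's the back-of-the-envelope math I'd run on the pass device and the VR string. Every number in it (raw B+, regulated B+, load currents, the 0A2/0B2 tube pair, thermal resistances, ambient temperature) is a placeholder assumption to show the shape of the calculation, not a value from my actual build:

```python
# Rough sizing sketch for a series-pass MOSFET B+ regulator.
# All numeric values are illustrative assumptions, not measurements.

V_RAW = 350.0    # assumed raw (unregulated) B+ at the MOSFET drain, V
V_REG = 300.0    # assumed regulated B+ output, V
I_IDLE = 0.090   # assumed quiescent draw of the whole amp, A
I_PEAK = 0.150   # assumed draw at full output, A

# Pass-device dissipation: P = (Vin - Vout) * I
p_idle = (V_RAW - V_REG) * I_IDLE
p_peak = (V_RAW - V_REG) * I_PEAK

# Heat sink sizing: keep the junction under its limit.
# Tj = Ta + P * (Rth_jc + Rth_cs + Rth_sa)  ->  solve for Rth_sa.
TJ_MAX = 125.0   # conservative junction limit, degC (datasheet-dependent)
TA = 40.0        # assumed ambient inside the chassis, degC
RTH_JC = 1.0     # assumed junction-to-case thermal resistance, degC/W
RTH_CS = 0.5     # assumed case-to-sink (insulated mounting), degC/W

rth_sa_max = (TJ_MAX - TA) / p_peak - RTH_JC - RTH_CS

# Voltage reference: two VR tubes in series, e.g. 0A2 (150 V)
# + 0B2 (108 V) for roughly a 258 V reference; the dropping
# resistor must hold the string inside its operating current range.
V_REF = 150.0 + 108.0
I_VR = 0.015     # assumed VR string current, A (typ. 5-30 mA range)
r_drop = (V_RAW - V_REF) / I_VR

print(f"pass dissipation: {p_idle:.1f} W idle, {p_peak:.1f} W peak")
print(f"heat sink: <= {rth_sa_max:.1f} degC/W sink-to-ambient")
print(f"VR dropping resistor: ~{r_drop:.0f} ohm")
```

With these assumed numbers the pass device sits around 4.5-7.5 W, which is where the "substantial heat sink" comes from; swap in your own supply voltage and current draw and the same three lines of arithmetic apply.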
Basic design highlights:
-6V6 AB1 output stage, combination of fixed bias and cathode bias
-Concertina phase splitter, cathode biased
-Cathode biased 12AX7 input stage
However, after some casual internet research I've seen a few cautionary notes about combining a regulated B+ with certain bias strategies, along with a more common emphasis on regulating pentode screen-grid voltages rather than the B+. The basic saying seems to be "regulate both the B+ and the fixed-bias supply, or regulate neither." I think I follow the logic behind avoiding a regulated bias supply when the B+ is unregulated (the bias should sag along with the plate voltage), but in the converse case, where the B+ is regulated, does the fixed-bias supply need to be regulated as well? Or more simply, can any of the stages be cathode biased if their plate voltages are regulated? I have no issue with regulating the fixed-bias portion of the output stage. Rather, I'm wondering whether I need to go with full fixed bias on the output stage and find a different way to bias the concertina and the input stage (both currently cathode biased).
Also, will the ultra-linear configuration interact unfavorably with a regulated B+?
Thanks!