I think it would be best if this could be integrated with the existing displacement system somehow. A design principle we try to follow in the shading nodes is to keep the specific render algorithm separate from the description of the surface shading as much as possible, making exceptions only when there is no way around it. Decoupling also makes material interchange easier, both between Cycles and Eevee and with other applications.
The problem is indeed that standard POM really only works with a single UV map as the texture coordinate; anything else, like multiple UV maps or generated coordinates, does not work. If those could be made to work that would be ideal, but I'm not sure exactly how it would be done.
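For illustration, here is a rough sketch of the standard POM ray march in Python-style pseudocode (not the patch's actual code; `parallax_occlusion_uv`, `sample_height`, `depth_scale` and so on are placeholder names). It shows why only the single UV being marched comes out offset:

```python
def parallax_occlusion_uv(uv, view_dir_ts, sample_height,
                          depth_scale=0.05, num_steps=32):
    """Standard POM sketch: march the tangent-space view ray through the
    height field and return the offset UV where the ray meets the surface.

    uv            -- the single 2D texture coordinate being marched
    view_dir_ts   -- normalized view direction in tangent space (x, y, z),
                     pointing from the surface towards the viewer
    sample_height -- callable returning a height in [0, 1] at a given UV
    """
    # UV offset per step, along the projection of the view ray onto the
    # tangent plane, scaled by the displacement depth.
    duv = (-view_dir_ts[0] / view_dir_ts[2] * depth_scale / num_steps,
           -view_dir_ts[1] / view_dir_ts[2] * depth_scale / num_steps)
    step_depth = 1.0 / num_steps

    cur_uv = uv
    cur_depth = 0.0
    for _ in range(num_steps):
        # Stop once the ray has dipped below the height field.
        if cur_depth >= 1.0 - sample_height(cur_uv):
            break
        cur_uv = (cur_uv[0] + duv[0], cur_uv[1] + duv[1])
        cur_depth += step_depth

    # Only this one UV coordinate comes out offset; a second UV map or
    # generated/object coordinates are never marched, so textures driven
    # by them keep sampling the un-offset position.
    return cur_uv
```

Any texture wired to the returned UV follows the parallax effect, but something like a Noise Texture using generated coordinates keeps sampling at the original position, which is exactly the mismatch described above.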
If that limitation remains, it becomes a matter of UI/UX design. With or without a POM node, the fact that e.g. a Noise Texture node will not work correctly by default is not obvious to users; you have to read the docs to understand it. Having to wire the UV output of the POM node into texture nodes makes it a bit more explicit, but it is also inconvenient. I don't immediately know how we could communicate this well in the shader node interface itself, with or without a POM node.
I don't think this should be implemented in Cycles. It's not a technique that's great for path tracing, and it complicates the Cycles kernel. Especially when it comes to further improvements like better shadows or shell geometry, it's not a direction I want to go in; I'd rather focus on improving the adaptive subdivision and displacement system than adopt game engine techniques.