I'm looking for neural networks $n_\theta(x)$ which integrate to a constant $\int_{[0,1]^d} n_\theta(x)\; dx=c\in\mathbb{R}$. Crucially, the constant should be the same for all weights $\theta$ for which $n_\theta(x)$ is finite.
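To make the requirement concrete, here is a minimal Monte Carlo sketch (the network, shapes, and sample size are just illustrative choices of mine) showing that for an unconstrained MLP the integral does depend on $\theta$, which is exactly what I want a constraint to rule out:

```python
import numpy as np

rng = np.random.default_rng(0)
d = 2

def mlp(x, W1, b1, W2, b2):
    # A plain one-hidden-layer network with tanh activations.
    return np.tanh(x @ W1 + b1) @ W2 + b2

x = rng.uniform(size=(100_000, d))       # uniform samples on [0,1]^d
for seed in range(3):                    # three random draws of theta
    r = np.random.default_rng(seed)
    W1, b1 = r.normal(size=(d, 16)), r.normal(size=16)
    W2, b2 = r.normal(size=(16, 1)), r.normal(size=1)
    est = mlp(x, W1, b1, W2, b2).mean()  # Monte Carlo estimate of the integral
    print(f"theta draw {seed}: integral ~ {est:.4f}")
```

The three printed estimates differ, i.e. $c$ is not constant in $\theta$ for a generic network.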
Example. Normalizing flows use neural networks $n(x)$ to represent a probability density function $p_n(x)$, which satisfies $\int_{[0,1]^d}p_n(x)\; dx=1$ by construction. The same is true for autoregressive models like PixelCNN.
It would also be interesting if the integral were something one could compute with a closed-form formula, especially if that entails weaker architectural constraints than normalizing flows and PixelCNN impose.
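For a sense of what I mean by a formula, here is a toy 1-D sketch (purely illustrative, not a proposed answer): a single hidden layer of sigmoids admits the closed form $\int_0^1 \sigma(wx+b)\,dx = \big(\operatorname{softplus}(w+b)-\operatorname{softplus}(b)\big)/w$ for $w\neq 0$, although the value still depends on $\theta$:

```python
import numpy as np

def softplus(z):
    return np.logaddexp(0.0, z)          # log(1 + e^z), numerically stable

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

rng = np.random.default_rng(1)
v, w, b = rng.normal(size=8), rng.normal(size=8), rng.normal(size=8)

# Closed form: integrate each sigmoid unit over [0, 1] and sum with weights v.
closed_form = np.sum(v * (softplus(w + b) - softplus(b)) / w)

# Numerical cross-check on a fine grid.
xs = np.linspace(0.0, 1.0, 200_001)
vals = np.sum(v[:, None] * sigmoid(np.outer(w, xs) + b[:, None]), axis=0)
print(closed_form, vals.mean())          # the two values agree
```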
Clarification 1. While the example uses neural networks that represent probability density functions, that is not a requirement; it was just the only way I knew how to make the integral tractable.
Clarification 2. I'm not asking about one particular architecture. I want to find an architectural constraint that guarantees the integral is tractable, and the "weaker" the constraint the better.
Bonus example. Additive coupling layers are volume preserving ($|\det J_n(x)|=1$), so the change of variables $y=n(x)$ gives $\int_{x\in D} n(x) \; dx= \int_{y\in n(D)} y\; dy$, which equals $\int_{x\in D}x\; dx$ whenever $n$ maps $D$ onto itself.
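A quick numerical check of the volume-preservation claim, assuming the standard additive coupling form $y_1 = x_1,\; y_2 = x_2 + t(x_1)$ (the small MLP playing the role of $t$ below is an arbitrary placeholder):

```python
import torch

torch.manual_seed(0)
# t can be any function of x1; a tiny MLP stands in for it here.
t = torch.nn.Sequential(torch.nn.Linear(2, 16), torch.nn.Tanh(), torch.nn.Linear(16, 2))

def coupling(x):
    # Additive coupling: y1 = x1, y2 = x2 + t(x1).
    x1, x2 = x[:2], x[2:]
    return torch.cat([x1, x2 + t(x1)])

x = torch.rand(4)
J = torch.autograd.functional.jacobian(coupling, x)
print(torch.det(J))   # prints 1.0 (up to floating-point error): |det J| = 1
```

The Jacobian is block lower-triangular with identity blocks on the diagonal, so its determinant is $1$ regardless of $t$.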