Consider the following random process which is defined on $n$ numbers $0\leq x_1,\ldots,x_n\leq 1$:
At each step, pick an arbitrary number, say $x_i$, and resample its value to a random $x'_i$ (independently of all previous steps) such that $0\leq x'_i \leq 1$ and $E[x'_i]=x_i$. The process stops once every number is either $0$ or $1$.
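For intuition only (this is not part of the question itself), here is a minimal simulation of the single-coordinate process. It uses the simplest mean-preserving update, which jumps directly to $\{0,1\}$; any other update rule satisfying the constraints would do:

```python
import random

def run_process(xs, rng=random):
    """Simulate the rounding process on a copy of xs.

    At each step a fractional coordinate is resampled from a mean-preserving
    distribution on {0, 1} (one valid choice of update, since E[x'_i] = x_i);
    this particular choice also guarantees termination.
    """
    ys = list(xs)
    while any(0 < y < 1 for y in ys):
        # pick any coordinate that is still fractional
        fractional = [j for j, y in enumerate(ys) if 0 < y < 1]
        i = rng.choice(fractional)
        # round x_i to 1 with probability x_i, else to 0 (so E[x'_i] = x_i)
        ys[i] = 1 if rng.random() < ys[i] else 0
    return ys
```

Averaging $\sum_i y_i$ over many runs recovers $\sum_i x_i$, as the martingale property predicts.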
Once the process has stopped, it is easy to verify that a Chernoff-type bound holds in the following sense: let the outcome of the process be the binary numbers $y_1,\ldots,y_n$. For any subset $S$ of indices define $\mu_S=\sum_{i\in S} x_i$; then, for every $0<\epsilon<1$, \begin{align*} \Pr\left\{\biggl|\mu_S - \sum_{i\in S} y_i\biggr| >\epsilon \cdot \mu_S \right\} \leq 2e^{-\epsilon^2 \mu_S /3}. \end{align*}
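One way to verify this (a sketch of the standard moment-generating-function argument): for fixed $\lambda$, consider the quantity $\prod_{i\in S}\left(x_i e^{\lambda} + 1 - x_i\right)$ evaluated at the current values. Since this expression is linear in each $x_i$ and every update preserves $E[x'_i]=x_i$, it is a martingale of the process, and at stopping it equals $e^{\lambda \sum_{i\in S} y_i}$. Optional stopping therefore gives
\begin{align*}
E\left[e^{\lambda \sum_{i\in S} y_i}\right] = \prod_{i\in S}\left(x_i e^{\lambda} + 1 - x_i\right),
\end{align*}
which is exactly the moment generating function of a sum of independent Bernoulli($x_i$) variables; the usual Chernoff computation then yields the stated bound for $0<\epsilon<1$. (Note that for the coupled process below this product is no longer a martingale, since several coordinates move simultaneously using a shared coin, which is what makes the question nontrivial.)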
I need to find a similar bound for a slightly more complicated process:
At each step, pick a subset of at most $d$ of the numbers, say $\{x_{i_1},\ldots,x_{i_d}\}$, and change their values to $\{x'_{i_1},\ldots,x'_{i_d}\}$, either increasing all of these numbers or decreasing all of them, in such a way that $0\leq x'_{i_j}\leq 1$ and $E[x'_{i_j}]=x_{i_j}$ for all $j$.
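Again purely for illustration (not from the question), here is a minimal simulation of one valid instance of this coupled process: the shared coin is fair, and each chosen coordinate moves by its own step size $\min(x_j, 1-x_j)$, which keeps $0\leq x'_j\leq 1$ and $E[x'_j]=x_j$:

```python
from fractions import Fraction
import random

def run_coupled_process(xs, d, rng=random):
    """Simulate one (hypothetical) instance of the coupled process.

    Each step picks up to d fractional coordinates and, with a shared fair
    coin, moves all of them up or all of them down, each by its own step
    size min(x_j, 1 - x_j); this preserves 0 <= x'_j <= 1 and E[x'_j] = x_j.
    Exact rational arithmetic keeps the stopping test exact.
    """
    ys = [Fraction(x).limit_denominator(10**6) for x in xs]
    while True:
        frac = [j for j, y in enumerate(ys) if 0 < y < 1]
        if not frac:
            return [int(y) for y in ys]
        chosen = rng.sample(frac, min(d, len(frac)))
        up = rng.random() < 0.5  # shared coin: all chosen move the same way
        for j in chosen:
            step = min(ys[j], 1 - ys[j])
            ys[j] = ys[j] + step if up else ys[j] - step
```

Each touched coordinate lands in $\{0,1\}$ with probability at least $1/2$ per step, so the process terminates almost surely; the shared coin makes the chosen coordinates positively correlated within a step, which is exactly why the independent-coordinates argument no longer applies.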
Can we get a similar result for this process too? For instance, is it possible to replace the right-hand side of the above bound with $2e^{-\epsilon^2 \mu_S/O(d)}$? Thank you very much.