Short answer: No.
The constants are part of the algorithm. If you don't trust the constants, then don't trust the algorithm. It is worth looking at some examples.
Merkle-Damgård hashes
One constant that could be changed in such an algorithm is the IV. For the purpose of security analysis, changing the IV is equivalent to prepending one block of key material to the data. But prepending would be preferable, since it doesn't require changing the actual algorithm.
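To illustrate the equivalence, here is a minimal Merkle-Damgård sketch in Python. The compression function is a toy stand-in built from SHA-256, and real hashes also append length padding, which this omits: deriving a custom IV by running one block of key material through the compression function produces the same digests as prepending that block to the message.

```python
import hashlib

def compress(state: bytes, block: bytes) -> bytes:
    # Toy compression function; stands in for a real hash's internals.
    return hashlib.sha256(state + block).digest()

def md_hash(iv: bytes, blocks: list) -> bytes:
    # Plain Merkle-Damgård chaining, no length padding.
    state = iv
    for block in blocks:
        state = compress(state, block)
    return state

IV = bytes(32)                       # the "standard" IV
key_block = b"K" * 32                # one block of key material
custom_iv = compress(IV, key_block)  # "changing the IV"

msg = [b"A" * 32, b"B" * 32]
assert md_hash(custom_iv, msg) == md_hash(IV, [key_block] + msg)
```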
However, it is already known that simply prepending key material is not a good way to construct a keyed hash: a secret-prefix construction on a Merkle-Damgård hash is vulnerable to length-extension attacks. A better approach is to use an HMAC construction.
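As a sketch of what that better approach looks like, here is HMAC built by hand from its published definition (RFC 2104) and checked against Python's hmac module, assuming SHA-256 with its 64-byte block size:

```python
import hashlib
import hmac

def hmac_sha256(key: bytes, msg: bytes) -> bytes:
    block_size = 64  # SHA-256 block size in bytes
    if len(key) > block_size:
        key = hashlib.sha256(key).digest()   # long keys are hashed first
    key = key.ljust(block_size, b"\x00")     # then zero-padded to a block
    ipad = bytes(b ^ 0x36 for b in key)
    opad = bytes(b ^ 0x5c for b in key)
    inner = hashlib.sha256(ipad + msg).digest()
    return hashlib.sha256(opad + inner).digest()

key, msg = b"secret key", b"attack at dawn"
assert hmac_sha256(key, msg) == hmac.new(key, msg, hashlib.sha256).digest()
```

The key is mixed in twice with different pad constants, and the outer hash over the inner digest is what defeats the length-extension problem.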
The IV is not the only constant in a Merkle-Damgård hash. There are constants in the compression function as well. If you choose your own constants, there is a significant risk that they are weak for some reason. So even if all research points to the original being secure, your version might be weak. And should a weakness be found in the original algorithm, it is unlikely that you managed to come up with better constants.
In fact, to come up with better constants in that case, you would basically have had to find both the weakness and a fix for it. In other words, if you could not write a paper about why your constants are better than the originals and get it accepted by a cryptography journal, then your constants aren't any better.
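This is why standard designs try to earn trust in their constants by deriving them from "nothing up my sleeve" numbers. For example, FIPS 180-4 specifies SHA-256's round constants as the first 32 bits of the fractional parts of the cube roots of the first 64 primes, which anyone can recheck. A quick sketch verifying the first four:

```python
def first_primes(n):
    found, candidate = [], 2
    while len(found) < n:
        if all(candidate % p for p in found):
            found.append(candidate)
        candidate += 1
    return found

# First four published SHA-256 round constants (FIPS 180-4).
K_published = [0x428a2f98, 0x71374491, 0xb5c0fbcf, 0xe9b5dba5]

for p, k in zip(first_primes(4), K_published):
    frac = p ** (1.0 / 3.0) % 1.0          # fractional part of cbrt(p)
    assert int(frac * 2**32) == k
print("constants match their stated derivation")
```

Constants derived this way leave the designer little room to hide a deliberate weakness, which is exactly what the next example lacks.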
DECDRBG
This algorithm, better known as Dual_EC_DRBG, is a good example of why constants should not be trusted blindly. The original constants came with no explanation of how they were constructed. That alone is a warning sign, and those constants shouldn't be trusted.
Further analysis of how those constants could have been constructed revealed a possible backdoor. The entire algorithm is now considered suspect.
If you wanted to use that algorithm, you shouldn't be using the original untrusted constants. But if you generated your own constants instead, how could you make anybody else trust them?
In fact, with the knowledge we have now, DECDRBG with nonstandard constants is one of the simplest ways to create a cryptographic algorithm with your own backdoor and make it look like an innocent mistake.
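To make the mechanism concrete, here is a heavily simplified sketch of the Dual_EC_DRBG structure on a tiny textbook curve. All parameters here are toy values chosen for illustration; the real algorithm runs on NIST P-256 and truncates its output, which this sketch omits. Whoever knows the secret discrete log d relating the two points can recover the generator's internal state from a single output:

```python
# Toy illustration of the suspected Dual_EC_DRBG backdoor
# (Shumow & Ferguson, 2007). Curve y^2 = x^3 + 2x + 2 over F_17,
# base point (5, 1) of prime order 19 -- a standard textbook example.
P_FIELD, A, B = 17, 2, 2
Q = (5, 1)

def ec_add(p1, p2):
    if p1 is None: return p2
    if p2 is None: return p1
    (x1, y1), (x2, y2) = p1, p2
    if x1 == x2 and (y1 + y2) % P_FIELD == 0:
        return None                       # point at infinity
    if p1 == p2:
        lam = (3*x1*x1 + A) * pow(2*y1, -1, P_FIELD) % P_FIELD
    else:
        lam = (y2 - y1) * pow(x2 - x1, -1, P_FIELD) % P_FIELD
    x3 = (lam*lam - x1 - x2) % P_FIELD
    return (x3, (lam*(x1 - x3) - y1) % P_FIELD)

def ec_mul(k, pt):
    acc = None                            # double-and-add
    while k:
        if k & 1: acc = ec_add(acc, pt)
        pt = ec_add(pt, pt)
        k >>= 1
    return acc

d = 7                                     # the designer's secret: P = d*Q
P = ec_mul(d, Q)

def dual_ec_step(s):
    # One round: next state from P, output from Q (no truncation here).
    return ec_mul(s, P)[0], ec_mul(s, Q)[0]

def lift_x(x):
    # Recover a curve point with the given x; brute force, field is tiny.
    rhs = (x**3 + A*x + B) % P_FIELD
    for y in range(P_FIELD):
        if y*y % P_FIELD == rhs:
            return (x, y)
    raise ValueError("not a valid x-coordinate")

s0 = 11
s1, output = dual_ec_step(s0)

# The attacker sees only `output`, but knows d: lifting the output back
# to a point and multiplying by d gives d*(s0*Q) = s0*P, the next state.
recovered = ec_mul(d, lift_x(output))[0]
assert recovered == s1
print("internal state recovered from one output:", recovered)
```

In the real algorithm the output is truncated by 16 bits, so the attacker has to try about 2^16 candidate points per output, which is still entirely practical.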
DECDRBG shows why, once there is justified skepticism about the constants in an algorithm, you shouldn't trust the algorithm until somebody comes up with real criteria for deciding what good constants for it would look like.