Hi,
It appears that torch.probability simply uses softmax for the simplex bijector.
Is there a reason our simplex transform is much more complicated?
I was also thinking about a GPU-friendly implementation, which seems hard to achieve with the current transform.
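
For concreteness, here is a minimal sketch of what a softmax-style simplex transform could look like. This is an illustration, not the actual bijector from any library: the function names are hypothetical, and I'm assuming the unconstrained vector has one fewer dimension than the simplex, with a fixed zero logit appended so the map is actually bijective (plain softmax is invariant to adding a constant).

```python
import torch

def simplex_from_unconstrained(y):
    # Hypothetical sketch: map y in R^{K-1} to the interior of the
    # K-simplex by appending a fixed 0 logit and applying softmax.
    # Pinning the last logit to 0 removes softmax's shift invariance,
    # which is what makes the map invertible.
    z = torch.cat([y, torch.zeros_like(y[..., :1])], dim=-1)
    return torch.softmax(z, dim=-1)

def unconstrained_from_simplex(x):
    # Inverse: log-ratios of each component against the last one,
    # recovering y since log(x_k / x_K) = z_k - 0 = y_k.
    return torch.log(x[..., :-1]) - torch.log(x[..., -1:])
```

Since this is a single vectorized softmax rather than a sequential stick-breaking recursion, I'd expect it to parallelize well on GPU.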