Replace pure python loops with numpy where possible in channels.py #5839
CirqBot merged 3 commits into quantumlib:master
Conversation
viathor left a comment:
I have some minor suggestions, but this is very nice!
```python
        c += np.outer(v, v.conj())
    return c
k = np.asarray(kraus_operators)
flat_ops = k.reshape((-1, d))
```
I think you can achieve the effect of asarray followed by reshape with reshape alone:

```python
k = np.reshape(kraus_operators, (-1, d))
return np.einsum(..., k, k.conj())
```
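The suggestion works because np.reshape accepts any array-like input directly, so the intermediate asarray call is redundant. A minimal sketch of the equivalence, using two made-up 2×2 operators (not code from the PR):

```python
import numpy as np

# Two arbitrary 2x2 operators, purely for illustration.
ops = [np.eye(2), np.array([[0.0, 1.0], [1.0, 0.0]])]

# Two-step version: convert to an array, then reshape.
a = np.asarray(ops).reshape((-1, 4))

# One-step version: np.reshape accepts the list directly.
b = np.reshape(ops, (-1, 4))

assert np.array_equal(a, b)
```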
As a matter of principle, I think we should prefer errors to be raised as soon as they occur rather than propagate far from their cause; when that happens, the error is harder to diagnose and fix. For this reason, I'd rather we be explicit about the expected array shape and say reshape((len(kraus_operators), d)) rather than reshape((-1, d)).

BTW: This -1 hides a concept that has a name: it is the Choi rank (assuming we were indeed passed a Kraus, i.e. minimal, representation)! You can use this to make the code read well and (integrating the other comment) say:

```python
choi_rank = len(kraus_operators)
k = np.reshape(kraus_operators, (choi_rank, d))
return np.einsum('bi,bj->ij', k, k.conj())
```

This makes it clear that the b index in the einsum ranges over the Kraus operators.
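To check that the einsum reproduces the original accumulation loop, here is a hedged sketch: the Kraus operators below (a bit-flip channel with an illustrative p = 0.1) and the use of d for the flattened operator length are assumptions for the example, not code from the PR.

```python
import numpy as np

# Hypothetical bit-flip channel with p = 0.1, used only for this check.
p = 0.1
kraus_operators = [
    np.sqrt(1 - p) * np.eye(2),
    np.sqrt(p) * np.array([[0.0, 1.0], [1.0, 0.0]]),
]
d = 4  # each 2x2 operator flattens to a length-4 vector

# Original pure-python loop: accumulate outer products of flattened operators.
c_loop = np.zeros((d, d), dtype=complex)
for op in kraus_operators:
    v = np.reshape(op, d)
    c_loop += np.outer(v, v.conj())

# Vectorized form: b indexes the Kraus operators (the Choi rank).
choi_rank = len(kraus_operators)
k = np.reshape(kraus_operators, (choi_rank, d))
c_einsum = np.einsum('bi,bj->ij', k, k.conj())

assert np.allclose(c_loop, c_einsum)
```

The contraction 'bi,bj->ij' computes sum over b of k[b, i] * k[b, j].conj(), which is exactly the summed outer products from the loop.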
Boosts speed.