MAINT: Use vectorization in plot_trisurf, simplifying greatly #9991
anntzer merged 1 commit into matplotlib:master
Conversation
(force-pushed 5d63231 to 5e1b08b)
eric-wieser
left a comment
codecov/patch is picking up the fact that shade=True was not tested before this patch, and that all the lines that were tested have been made a minority due to being combined into one.
```python
if shade:
    v1 = verts[:,0,:] - verts[:,1,:]
    v2 = verts[:,1,:] - verts[:,2,:]
    normals = np.cross(v1, v2)
```
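For reference, a minimal standalone sketch of what this vectorized hunk computes, assuming `verts` is an `(N, 3, 3)` array (N triangles, 3 vertices each, xyz coordinates); the example data here is hypothetical:

```python
import numpy as np

# Hypothetical example data: N = 2 triangles, 3 vertices each, xyz coords.
verts = np.array([[[0., 0., 0.], [1., 0., 0.], [0., 1., 0.]],
                  [[0., 0., 0.], [0., 1., 0.], [0., 0., 1.]]])

# Edge vectors for every triangle at once, as in the hunk above.
v1 = verts[:, 0, :] - verts[:, 1, :]
v2 = verts[:, 1, :] - verts[:, 2, :]

# np.cross broadcasts over the leading axis: one normal per triangle.
normals = np.cross(v1, v2)  # shape (N, 3)
```

`np.cross` applied to two `(N, 3)` arrays returns an `(N, 3)` array, which is what lets the per-triangle loop be dropped.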
If I combined these into one less-readable line, then your code coverage tests would pass. Aren't metrics great?
The other way of looking at that is the coverage tests would pass if this bit of code was tested!
It's not as if we really cared about what codecov says...
@dstansby: I suppose the way of looking at it is not "here's a die that's rolled which will randomly reject your PR", but "here's a die that, when rolled, might let you off a test".
I've added the test now :)
QuLogic
left a comment
There are a couple of PEP8 issues; because the check is disabled for this (extremely inconsistent) file, there are no complaints from Travis, but that doesn't mean we can't fix them when modifying the file.
Otherwise, I think this PR is good.
lib/mpl_toolkits/mplot3d/axes3d.py
Outdated
```python
colset = np.array(colset)
polyc.set_array(colset)
# average over the three points of each triangle
avg_z = verts[:,:,2].mean(axis=1)
```
I always seem to go with omitting the spaces in indexing expressions. But I guess PEP8 doesn't call out indexing with a tuple specifically, and if that's the style matplotlib is (trying to be) using, then I'll go with it.
lib/mpl_toolkits/mplot3d/axes3d.py
Outdated
```python
else:
    if shade:
        v1 = verts[:,0,:] - verts[:,1,:]
        v2 = verts[:,1,:] - verts[:,2,:]
```
(force-pushed 5e1b08b to f0077d2)
@QuLogic: Updated
(force-pushed f0077d2 to cb05203)
This seems a bit bigger than when I first reviewed it?
```python
    normals = np.cross(v1, v2)
else:
    normals = []
# verts = np.stack((xt, yt, zt), axis=-1)
```
Are they 1D or 2D? np.vstack or np.dstack could also be used in those cases.
Yep, this was a new change since you last reviewed. Both vstack and dstack have undesirable semantics of guessing the array .ndim, so using concatenate is more precise.
This isn't the only place where np.stack is mentioned in a comment above a np.concatenate((a[..., None], ...)), so it might be worth backporting the very simple
```python
def stack(arrays, axis):
    return np.concatenate([arr[..., np.newaxis] for arr in arrays], axis)
```

(oops, not correct)
Not something I want to involve in this PR though
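For the record, a corrected sketch (the one-liner above only matches np.stack for `axis=-1`, since it always appends the new axis at the end; inserting it at the requested position with `np.expand_dims` fixes that). This is an illustrative backport sketch, not the actual cbook code:

```python
import numpy as np

def stack(arrays, axis=0):
    # Sketch of np.stack in terms of np.concatenate: give each array a
    # new length-1 axis at `axis`, then concatenate along that axis.
    expanded = [np.expand_dims(np.asarray(arr), axis) for arr in arrays]
    return np.concatenate(expanded, axis=axis)

a = np.zeros((2, 3))
shape_last = stack([a, a, a, a], axis=-1).shape   # (2, 3, 4)
shape_first = stack([a, a], axis=0).shape         # (2, 2, 3)
```

For `axis=-1` this reduces to exactly the concatenate-with-`[..., None]` pattern the comments in the file describe.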
You can just stick the definition in https://github.com/matplotlib/matplotlib/blob/master/lib/matplotlib/cbook/_backports.py if you change your mind.
Oh, and to answer your question: 2D, with shape (N, edges_in_triangle).
We do have a cbook._backports; not sure how strong the need is for it without having done any looking.
I reckon there are a substantial number of cases; I can put together a PR once this is merged.
Looking again, a stack_last function is all that ever seems to be needed, implemented as the above, so that wouldn't belong in _backports. Suggestions of where to put such a helper?
cbook would be the place.
But given that we'd probably just be using stack(..., -1) "if it was available", I'd just backport stack.
Let us know if you want to do this in this PR or a separate one.
Definitely a separate one
(force-pushed cb05203 to 100f167)
Use it to compute surface normals and the mean z coordinate, rather than manual looping.
(force-pushed 100f167 to e36bb0b)
Cleanup only, no behavior changes.
I'd expect considerably better performance on large arrays, since fewer copies are made and the operations can be vectorized.
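As a rough illustration of the "no behavior changes" claim, a per-triangle Python loop (the old style this PR removes, sketched here from the discussion above, not copied from the old source) and the new vectorized expressions compute identical results:

```python
import numpy as np

rng = np.random.RandomState(0)
verts = rng.rand(1000, 3, 3)  # 1000 triangles, 3 vertices each, xyz

# Loop style: one Python-level iteration per triangle.
normals_loop = np.array([np.cross(t[0] - t[1], t[1] - t[2]) for t in verts])
avg_z_loop = np.array([t[:, 2].mean() for t in verts])

# Vectorized style, as in the patched plot_trisurf: one expression each.
normals_vec = np.cross(verts[:, 0, :] - verts[:, 1, :],
                       verts[:, 1, :] - verts[:, 2, :])
avg_z_vec = verts[:, :, 2].mean(axis=1)

same_normals = np.allclose(normals_loop, normals_vec)
same_avg_z = np.allclose(avg_z_loop, avg_z_vec)
```

The vectorized form does the same arithmetic in compiled NumPy loops, which is where the expected speedup on large triangulations comes from.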