Record flutter run success/fail, build mode, platform, start time in analytics #9597

xster merged 16 commits into flutter:master

Conversation
Added a clock to the testing context
tvolkert left a comment
Travis and AppVeyor seem unhappy.
Chaining the `..` cascade with the `.then()` makes this hard to immediately grok - maybe split it into distinct statements?
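A hypothetical before/after sketch of the suggested split (the names here are illustrative, not code from the PR):

```dart
// Before: a cascade (..) chained with .then() in one expression.
final Future<Null> done = (command..applySettings())
    .run()
    .then((int code) => reportExitCode(code));

// After: distinct statements, each step readable on its own.
command.applySettings();
final Future<int> result = command.run();
final Future<Null> done = result.then((int code) => reportExitCode(code));
```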
The formatting style for this is:

```dart
FlutterCommandResult(
  this.exitCode, {
  this.analyticsParameters,
  this.exitTime,
}) {
  assert(exitCode != null);
}
```
`final` on all three of these.
Nit: newline between field declarations.
Maybe explain that FlutterCommand is what's doing the measurement so the reader knows where this automatic calculation will be made. Something like "If this is not provided, [FlutterCommand] will default to the command's completion time when calculating the command's running time"
Is there a use case for commands providing their own entry time as well?
No. I think we should measure user-perceivable time. That latency measurement should start as early as possible.
By starting this here, we'll record the time it takes to update the cache if needed (which is done in verifyThenRunCommand()), which will mess with the analytics results...
Ditto. I think that's fine as a metric to optimize for.
Is this used anymore? It looks like UsageTimer can be nuked.
Do you mean MockClock (or FakeClock, given my comment below)?
Ah ya. I was gonna switch everything without breaking the existing tests but there were too many. Done. We can swap the others as needed.
It's such a simple interface, it may be worth writing a `FakeClock` and saving your tests the effort of mocking... It could have a setter for setting the current time, and it'd increment the value every time `now()` was called.
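A minimal sketch of that idea (the class shape and names are assumptions, not code from the PR):

```dart
/// A fake clock with a settable current time. Each call to now() returns
/// the current time and then advances it slightly, so successive
/// timestamps are distinct.
class FakeClock {
  FakeClock([DateTime initial]) : _current = initial ?? new DateTime(2017);

  DateTime _current;

  /// Pin the time that the next now() call will return.
  set current(DateTime time) {
    _current = time;
  }

  DateTime now() {
    final DateTime result = _current;
    _current = _current.add(const Duration(milliseconds: 1));
    return result;
  }
}
```

A test could then set `clock.current` before exercising the command and assert on the recorded duration, with no mocking boilerplate.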
I think it's ok for the test to be verbose wrt the input/output near the assertions.
endTime? endTimeOverride?
Sure. Renamed to endTimeOverride
nit: I think flutter style would want single quotes here
There's a utility to print ms - `getElapsedAsMilliseconds(Duration)`. Generally used for user-facing output (non-trace output), but perhaps useful here.
you don't need the await here, since you're now returning the Future
I think semantically, we don't need to await here
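The point about the redundant `await`, as a generic sketch (the function names are made up):

```dart
// Inside an async function, `return await future;` and `return future;`
// produce the same result; the extra await just adds a suspension point.
Future<int> computeExitCode() async => 0;

Future<int> withAwait() async {
  return await computeExitCode(); // redundant await
}

Future<int> withoutAwait() async {
  return computeExitCode(); // equivalent behavior
}
```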
Nit: for _ parameters, you can omit the type.
I'm confused as to what this is trying to do. Wouldn't scheduleMicrotask do the same thing cleanly? Or Future.sync?
Oops, our comments crossed paths midair.
I didn't really understand the references to Future in the Completer's dartdoc either. It seems to be written from the perspective of the Completer's reader rather than the Completer's supplier.
I'm trying to supply a Completer such that a piece of code runs synchronously when the owner of the Completer, a few layers deeper, calls complete() on it.
I'm not sure what "supply a completer" means. Normally a completer is private to the person who creates it, and you hand out its future.
@Hixie I had the same reaction, but it looks like it's an existing API in tools. I plan to remedy this in a separate PR by having the caller pass a callback instead of a completer.
cool neat. It doesn't actually seem heavily used in the codebase. A simpler callback would indeed be easier to parse.
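A sketch of the two shapes being discussed, with hypothetical names (this is not the actual tools API):

```dart
import 'dart:async';

typedef void AppStartedCallback();

// Completer-parameter style: the caller hands in a Completer and the
// callee, a few layers deeper, calls complete() when the app has started.
Future<Null> runWithCompleter(Completer<Null> appStartedCompleter) async {
  // ... launch the app ...
  appStartedCompleter.complete();
}

// Callback style: simpler to parse, and no Completer changes hands.
Future<Null> runWithCallback({AppStartedCallback onAppStarted}) async {
  // ... launch the app ...
  if (onAppStarted != null)
    onAppStarted();
}
```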
If you're synced to tip-of-tree, this now needs to be `await device.targetPlatform`
Likewise, `await device.isLocalEmulator`
Similar formatting comment here:

```dart
  Duration duration, {
  String label,
```
We used to check for suppressAnalytics...
MockUsage and MockClock are seeded in the context for tests by default, so you don't need to instantiate and override them here. (same comment in the tests below)
So I ended up defaulting back to a real clock. As we move more and more stuff to Clock, I think it's more convenient for tests that don't really care about time to keep working without null handling, and to get a real value rather than an arbitrary one.
Ah, I was wondering why Travis was failing. Rebased. Also TIL quiver. Removed Clock since there already is one, and used some of the string functions.
```dart
  String label,
}) {
  if (!suppressAnalytics) {
    _analytics.sendTiming(
```
Measures from `flutter run` to the app start. Records `flutter run` success/fail, build mode, hot/cold, platform, emulator.

Fixes #9500
Fixes #9499
Fixes #9498