This repository was archived by the owner on Feb 22, 2023. It is now read-only.

[e2e] Add new e2e_driver for handling response data and performance watcher #2906

Merged
merged 6 commits on Aug 5, 2020
5 changes: 5 additions & 0 deletions packages/e2e/CHANGELOG.md
@@ -1,3 +1,8 @@
## 0.6.3

* Add customizable `flutter_driver` adaptor.
* Add utilities for tracking frame performance in an e2e test.

## 0.6.2+1

* Fix incorrect test results when one test passes then another fails
2 changes: 1 addition & 1 deletion packages/e2e/example/test_driver/example_e2e_test.dart
@@ -2,4 +2,4 @@ import 'dart:async';

import 'package:e2e/e2e_driver.dart' as e2e;

Future<void> main() async => e2e.main();
Future<void> main() async => e2e.e2eDriver();
80 changes: 76 additions & 4 deletions packages/e2e/lib/e2e_driver.dart
@@ -1,21 +1,93 @@
// Copyright 2014 The Flutter Authors. All rights reserved.
// Use of this source code is governed by a BSD-style license that can be
// found in the LICENSE file.

import 'dart:async';
import 'dart:convert';
import 'dart:io';

import 'package:e2e/common.dart' as e2e;
import 'package:flutter_driver/flutter_driver.dart';

Future<void> main() async {
import 'package:e2e/common.dart' as e2e;
import 'package:path/path.dart' as path;

/// This method remains for backward compatibility.
Future<void> main() => e2eDriver();

/// Flutter Driver test output directory.
///
/// Tests should write any output files to this directory. Defaults to the path
/// set in the FLUTTER_TEST_OUTPUTS_DIR environment variable, or `build` if
/// unset.
String testOutputsDirectory =
Platform.environment['FLUTTER_TEST_OUTPUTS_DIR'] ?? 'build';

/// The callback type to handle [e2e.Response.data] after the test succeeds.
typedef ResponseDataCallback = FutureOr<void> Function(Map<String, dynamic>);

/// Writes json-serializable `data` to
/// [testOutputsDirectory]/`testOutputFilename.json`.
///
/// This is the default `responseDataCallback` in [e2eDriver].
Future<void> writeResponseData(
Map<String, dynamic> data, {
String testOutputFilename = 'e2e_response_data',
String destinationDirectory,
}) async {
assert(testOutputFilename != null);
destinationDirectory ??= testOutputsDirectory;
await fs.directory(destinationDirectory).create(recursive: true);
final File file = fs.file(path.join(
destinationDirectory,
'$testOutputFilename.json',
));
final String resultString = _encodeJson(data, true);
await file.writeAsString(resultString);
}
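
// Illustrative sketch (not part of this change): a driver script could also
// call [writeResponseData] directly; the map contents and the
// 'custom_report' file name below are hypothetical.
Future<void> writeCustomReport() async {
  await writeResponseData(
    <String, dynamic>{'button_taps': 3, 'startup_ok': true},
    // Written to $testOutputsDirectory/custom_report.json.
    testOutputFilename: 'custom_report',
  );
}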

/// Adaptor to run an E2E test using `flutter drive`.
///
/// `timeout` controls the longest time waited before the test ends.
/// It is not necessarily the execution time for the test app: the test may
/// finish sooner than `timeout`.
///
/// `responseDataCallback` is the handler for processing [e2e.Response.data].
/// The default value is `writeResponseData`.
///
/// To run an E2E test `<test_name>.dart` using `flutter drive`, put a file
/// named `<test_name>_test.dart` in the app's `test_driver` directory:
///
/// ```dart
/// import 'dart:async';
///
/// import 'package:e2e/e2e_driver.dart' as e2e;
///
/// Future<void> main() async => e2e.e2eDriver();
///
/// ```
Future<void> e2eDriver({
Duration timeout = const Duration(minutes: 1),
ResponseDataCallback responseDataCallback = writeResponseData,
Review comment:

Defaulting to write response.data into e2e_perf_summary.json seems a little strange, as it doesn't necessarily contain perf summary data. Shall we default to not writing any file, and provide a perfResponseCallback that writes response.data['performance'] into e2e_perf_summary.json? That seems a little more helpful for updating your current code in https://github.com/flutter/flutter/pull/61509/files.

Contributor Author:

I should probably change the file name to something more general. Getting the performance entry is just one more line, as in https://github.com/flutter/flutter/pull/62064/files#diff-73e66281b14c792afcdcd20acb9edd92R11, while I imagine this e2eDriver being used in a broader context, e.g. reporting a timeline as in flutter/flutter#58789, so I'm trying to make this function useful independent of the performance-test context while still being useful in my project.

Contributor Author (@CareF, Aug 5, 2020):

As for not writing anything by default: writeResponseData would then look like an unused method. Maybe I should add an example to e2e demonstrating the usage of this PR. If we decide to go that direction, I'll open a new PR for that after this lands.
}) async {
final FlutterDriver driver = await FlutterDriver.connect();
final String jsonResult =
await driver.requestData(null, timeout: const Duration(minutes: 1));
final String jsonResult = await driver.requestData(null, timeout: timeout);
final e2e.Response response = e2e.Response.fromJson(jsonResult);
await driver.close();

if (response.allTestsPassed) {
print('All tests passed.');
if (responseDataCallback != null) {
await responseDataCallback(response.data);
}
exit(0);
} else {
print('Failure Details:\n${response.formattedFailureDetails}');
exit(1);
}
}

const JsonEncoder _prettyEncoder = JsonEncoder.withIndent(' ');

String _encodeJson(Map<String, dynamic> jsonObject, bool pretty) {
return pretty ? _prettyEncoder.convert(jsonObject) : json.encode(jsonObject);
}
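
A usage sketch for the driver above (not part of this diff): assuming the e2e test wraps its actions in `watchPerformance` from `e2e_perf.dart`, so that `response.data` contains a `'performance'` entry, a `test_driver/<test_name>_test.dart` could pass a custom callback that writes only that entry. The two-minute timeout and the `e2e_perf_summary` file name here are illustrative choices, not defaults of the package.

```dart
import 'dart:async';

import 'package:e2e/e2e_driver.dart' as e2e;

Future<void> main() async => e2e.e2eDriver(
      timeout: const Duration(minutes: 2),
      responseDataCallback: (Map<String, dynamic> data) async {
        // Keep only the frame timing summary reported by watchPerformance.
        await e2e.writeResponseData(
          data['performance'] as Map<String, dynamic>,
          testOutputFilename: 'e2e_perf_summary',
        );
      },
    );
```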
199 changes: 199 additions & 0 deletions packages/e2e/lib/e2e_perf.dart
@@ -0,0 +1,199 @@
// Copyright 2014 The Flutter Authors. All rights reserved.
// Use of this source code is governed by a BSD-style license that can be
// found in the LICENSE file.

import 'dart:async';
import 'dart:ui';

import 'package:flutter/scheduler.dart';
import 'package:flutter_test/flutter_test.dart';
import 'package:flutter/widgets.dart';

import 'package:e2e/e2e.dart';

/// The maximum amount of time considered safe to spend for a frame's build
/// phase. Anything past that is in danger of missing the frame at 60 FPS.
///
/// Changing this doesn't re-evaluate an existing summary.
Duration kBuildBudget = const Duration(milliseconds: 16);
// TODO(CareF): Automatically calculate the refresh budget (#61958)

bool _firstRun = true;

/// The warning message to show when a benchmark is performed with asserts on.
/// TODO(CareF): remove this and update pubspec when flutter/flutter#61509 is
/// in a released version.
const String kDebugWarning = '''
┏╍╍╍╍╍╍╍╍╍╍╍╍╍╍╍╍╍╍╍╍╍╍╍╍╍╍╍╍╍╍╍╍╍╍╍╍╍╍╍╍╍╍╍╍╍╍╍╍╍╍╍╍╍╍╍┓
┇ ⚠ THIS BENCHMARK IS BEING RUN IN DEBUG MODE ⚠ ┇
┡╍╍╍╍╍╍╍╍╍╍╍╍╍╍╍╍╍╍╍╍╍╍╍╍╍╍╍╍╍╍╍╍╍╍╍╍╍╍╍╍╍╍╍╍╍╍╍╍╍╍╍╍╍╍╍┦
│ │
│ Numbers obtained from a benchmark while asserts are │
│ enabled will not accurately reflect the performance │
│ that will be experienced by end users using release ╎
│ builds. Benchmarks should be run using this command ╎
│ line: "flutter run --profile test.dart" or ┊
│ or "flutter drive --profile -t test.dart". ┊
│ ┊
└─────────────────────────────────────────────────╌┄┈ 🐢
''';

/// Watches the [FrameTiming] of `action` and reports it to the e2e binding.
Future<void> watchPerformance(
E2EWidgetsFlutterBinding binding,
Future<void> action(), {
String reportKey = 'performance',
}) async {
assert(() {
if (_firstRun) {
debugPrint(kDebugWarning);
_firstRun = false;
}
return true;
}());
final List<FrameTiming> frameTimings = <FrameTiming>[];
final TimingsCallback watcher = frameTimings.addAll;
binding.addTimingsCallback(watcher);
await action();
binding.removeTimingsCallback(watcher);
final FrameTimingSummarizer frameTimes = FrameTimingSummarizer(frameTimings);
binding.reportData = <String, dynamic>{reportKey: frameTimes.summary};
}
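
// Illustrative sketch (not part of this change): an e2e test could wrap an
// interaction in [watchPerformance] like this. The list contents and the
// fling gesture below are hypothetical; in a real e2e test this would be the
// test file's main().
Future<void> exampleMain() async {
  final E2EWidgetsFlutterBinding binding =
      E2EWidgetsFlutterBinding.ensureInitialized() as E2EWidgetsFlutterBinding;
  testWidgets('scrolling performance', (WidgetTester tester) async {
    await tester.pumpWidget(Directionality(
      textDirection: TextDirection.ltr,
      child: ListView.builder(
        itemCount: 1000,
        itemBuilder: (BuildContext context, int index) => Text('Item $index'),
      ),
    ));
    // Collect frame timings only while the fling and settle are running; the
    // summary ends up in the driver response under the 'performance' key.
    await watchPerformance(binding, () async {
      await tester.fling(find.byType(ListView), const Offset(0, -500), 1000);
      await tester.pumpAndSettle();
    });
  });
}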

/// This class summarizes a list of [FrameTiming] into performance
/// metrics.
class FrameTimingSummarizer {
/// Summarize `data` to frame build time and frame rasterizer time statistics.
///
/// See [TimelineSummary.summaryJson] for detail.
factory FrameTimingSummarizer(List<FrameTiming> data) {
assert(data != null);
assert(data.isNotEmpty);
final List<Duration> frameBuildTime = List<Duration>.unmodifiable(
data.map<Duration>((FrameTiming datum) => datum.buildDuration),
);
final List<Duration> frameBuildTimeSorted =
List<Duration>.from(frameBuildTime)..sort();
final List<Duration> frameRasterizerTime = List<Duration>.unmodifiable(
data.map<Duration>((FrameTiming datum) => datum.rasterDuration),
);
final List<Duration> frameRasterizerTimeSorted =
List<Duration>.from(frameRasterizerTime)..sort();
final Duration Function(Duration, Duration) add =
(Duration a, Duration b) => a + b;
return FrameTimingSummarizer._(
frameBuildTime: frameBuildTime,
frameRasterizerTime: frameRasterizerTime,
// This average calculation has microsecond precision, which is fine
// because typical values of these times are milliseconds.
averageFrameBuildTime: frameBuildTime.reduce(add) ~/ data.length,
p90FrameBuildTime: _findPercentile(frameBuildTimeSorted, 0.90),
p99FrameBuildTime: _findPercentile(frameBuildTimeSorted, 0.99),
worstFrameBuildTime: frameBuildTimeSorted.last,
missedFrameBuildBudget: _countExceed(frameBuildTimeSorted, kBuildBudget),
averageFrameRasterizerTime:
frameRasterizerTime.reduce(add) ~/ data.length,
p90FrameRasterizerTime: _findPercentile(frameRasterizerTimeSorted, 0.90),
p99FrameRasterizerTime: _findPercentile(frameRasterizerTimeSorted, 0.99),
worstFrameRasterizerTime: frameRasterizerTimeSorted.last,
missedFrameRasterizerBudget:
_countExceed(frameRasterizerTimeSorted, kBuildBudget),
);
}

const FrameTimingSummarizer._({
@required this.frameBuildTime,
@required this.frameRasterizerTime,
@required this.averageFrameBuildTime,
@required this.p90FrameBuildTime,
@required this.p99FrameBuildTime,
@required this.worstFrameBuildTime,
@required this.missedFrameBuildBudget,
@required this.averageFrameRasterizerTime,
@required this.p90FrameRasterizerTime,
@required this.p99FrameRasterizerTime,
@required this.worstFrameRasterizerTime,
@required this.missedFrameRasterizerBudget,
});

/// List of frame build time in microseconds
final List<Duration> frameBuildTime;

/// List of frame rasterizer time in microseconds
final List<Duration> frameRasterizerTime;

/// The average value of [frameBuildTime] in milliseconds.
final Duration averageFrameBuildTime;

/// The 90-th percentile value of [frameBuildTime] in milliseconds
final Duration p90FrameBuildTime;

/// The 99-th percentile value of [frameBuildTime] in milliseconds
final Duration p99FrameBuildTime;

/// The largest value of [frameBuildTime] in milliseconds
final Duration worstFrameBuildTime;

/// Number of items in [frameBuildTime] that's greater than [kBuildBudget]
final int missedFrameBuildBudget;

/// The average value of [frameRasterizerTime] in milliseconds.
final Duration averageFrameRasterizerTime;

/// The 90-th percentile value of [frameRasterizerTime] in milliseconds.
final Duration p90FrameRasterizerTime;

/// The 99-th percentile value of [frameRasterizerTime] in milliseconds.
final Duration p99FrameRasterizerTime;

/// The largest value of [frameRasterizerTime] in milliseconds.
final Duration worstFrameRasterizerTime;

/// Number of items in [frameRasterizerTime] that's greater than [kBuildBudget]
final int missedFrameRasterizerBudget;

/// Convert the summary result to a json object.
///
/// See [TimelineSummary.summaryJson] for detail.
Map<String, dynamic> get summary => <String, dynamic>{
'average_frame_build_time_millis':
averageFrameBuildTime.inMicroseconds / 1E3,
'90th_percentile_frame_build_time_millis':
p90FrameBuildTime.inMicroseconds / 1E3,
'99th_percentile_frame_build_time_millis':
p99FrameBuildTime.inMicroseconds / 1E3,
'worst_frame_build_time_millis':
worstFrameBuildTime.inMicroseconds / 1E3,
'missed_frame_build_budget_count': missedFrameBuildBudget,
'average_frame_rasterizer_time_millis':
averageFrameRasterizerTime.inMicroseconds / 1E3,
'90th_percentile_frame_rasterizer_time_millis':
p90FrameRasterizerTime.inMicroseconds / 1E3,
'99th_percentile_frame_rasterizer_time_millis':
p99FrameRasterizerTime.inMicroseconds / 1E3,
'worst_frame_rasterizer_time_millis':
worstFrameRasterizerTime.inMicroseconds / 1E3,
'missed_frame_rasterizer_budget_count': missedFrameRasterizerBudget,
'frame_count': frameBuildTime.length,
'frame_build_times': frameBuildTime
.map<int>((Duration datum) => datum.inMicroseconds)
.toList(),
'frame_rasterizer_times': frameRasterizerTime
.map<int>((Duration datum) => datum.inMicroseconds)
.toList(),
};
}

// The following helper functions require the data to be sorted.

// Returns the 100*p-th percentile of the data.
T _findPercentile<T>(List<T> data, double p) {
assert(p >= 0 && p <= 1);
return data[((data.length - 1) * p).round()];
}

// Returns the number of items in data that are greater than threshold.
int _countExceed<T extends Comparable<T>>(List<T> data, T threshold) {
return data.length -
data.indexWhere((T datum) => datum.compareTo(threshold) > 0);
}
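
To make the two helpers above concrete, here is a small standalone sketch (with made-up numbers) that applies the same indexing and counting to a sorted list of durations:

```dart
void main() {
  // Ten sorted samples: 1 ms, 2 ms, ..., 10 ms.
  final List<Duration> sorted = <Duration>[
    for (int ms = 1; ms <= 10; ms += 1) Duration(milliseconds: ms),
  ];

  // Same indexing as _findPercentile: the 90th percentile is the element at
  // index round((10 - 1) * 0.9) = 8, i.e. 9 ms.
  final Duration p90 = sorted[((sorted.length - 1) * 0.9).round()];

  // Same counting as _countExceed: items strictly greater than 4 ms are
  // 5 ms through 10 ms, so the count is 6.
  final int exceeding = sorted.length -
      sorted.indexWhere((Duration d) => d > const Duration(milliseconds: 4));

  print('p90: $p90, exceeding: $exceeding'); // p90: 0:00:00.009000, exceeding: 6
}
```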
3 changes: 2 additions & 1 deletion packages/e2e/pubspec.yaml
@@ -1,6 +1,6 @@
name: e2e
description: Runs tests that use the flutter_test API as integration tests.
version: 0.6.2+1
version: 0.6.3
homepage: https://github.com/flutter/plugins/tree/master/packages/e2e

environment:
@@ -14,6 +14,7 @@ dependencies:
sdk: flutter
flutter_test:
sdk: flutter
path: ^1.6.4

dev_dependencies:
pedantic: ^1.8.0
35 changes: 35 additions & 0 deletions packages/e2e/test/frame_timing_summarizer_test.dart
@@ -0,0 +1,35 @@
import 'dart:ui';

import 'package:flutter_test/flutter_test.dart';

import 'package:e2e/e2e_perf.dart';

void main() {
test('Test FrameTimingSummarizer', () {
List<int> buildTimes = <int>[
for (int i = 1; i <= 100; i += 1) 1000 * i,
];
buildTimes = buildTimes.reversed.toList();
List<int> rasterTimes = <int>[
for (int i = 1; i <= 100; i += 1) 1000 * i + 1000,
];
rasterTimes = rasterTimes.reversed.toList();
List<FrameTiming> inputData = <FrameTiming>[
for (int i = 0; i < 100; i += 1)
FrameTiming(<int>[0, buildTimes[i], 500, rasterTimes[i]]),
];
FrameTimingSummarizer summary = FrameTimingSummarizer(inputData);
expect(summary.averageFrameBuildTime.inMicroseconds, 50500);
expect(summary.p90FrameBuildTime.inMicroseconds, 90000);
expect(summary.p99FrameBuildTime.inMicroseconds, 99000);
expect(summary.worstFrameBuildTime.inMicroseconds, 100000);
expect(summary.missedFrameBuildBudget, 84);

expect(summary.averageFrameRasterizerTime.inMicroseconds, 51000);
expect(summary.p90FrameRasterizerTime.inMicroseconds, 90500);
expect(summary.p99FrameRasterizerTime.inMicroseconds, 99500);
expect(summary.worstFrameRasterizerTime.inMicroseconds, 100500);
expect(summary.missedFrameRasterizerBudget, 85);
expect(summary.frameBuildTime.length, 100);
});
}