Merge lp:~josharenson/mir/install_glmark2 into lp:mir
Status: | Superseded |
---|---|
Proposed branch: | lp:~josharenson/mir/install_glmark2 |
Merge into: | lp:mir |
Diff against target: |
181 lines (+143/-0) 4 files modified
debian/mir-test-tools.install (+1/-0) tests/CMakeLists.txt (+5/-0) tests/performance-tests/CMakeLists.txt (+26/-0) tests/performance-tests/test_glmark2-es2-mir.cpp (+111/-0) |
To merge this branch: | bzr merge lp:~josharenson/mir/install_glmark2 |
Related bugs: |
Reviewer | Review Type | Date Requested | Status |
---|---|---|---|
PS Jenkins bot (community) | continuous-integration | Needs Fixing | |
Alberto Aguirre (community) | Needs Fixing | ||
Alan Griffiths | Needs Fixing | ||
Alexandros Frantzis (community) | Needs Fixing | ||
Daniel van Vugt | Abstain | ||
Robert Carr (community) | Needs Information | ||
Review via email: mp+216504@code.launchpad.net |
This proposal has been superseded by a proposal from 2014-04-30.
Commit message
Creates a performance test that runs glmark2 and saves the score to a JSON file.
Description of the change
Josh Arenson (josharenson) wrote:
> We don't need to link mir_performance
Absolutely.
Alan Griffiths (alan-griffiths) wrote:
Where should we source glmark2-es2-mir?
$ apt-cache search glmark2-es2-mir
$
Josh Arenson (josharenson) wrote:
> Where should we source glmark2-es2-mir?
>
> $ apt-cache search glmark2-es2-mir
> $
glmark2-es2-mir (the version that actually runs) is currently located in ppa:mir-
Robert Carr (robertcarr) wrote:
Why doesn't Jenkins even say hello to this branch?
Josh Arenson (josharenson) wrote:
> Why doesn't Jenkins even say hello to this branch?
According to fginther, there is an outage.
PS Jenkins bot (ps-jenkins) wrote:
FAILED: Continuous integration, rev:1572
No commit message was specified in the merge proposal. Click on the following link and set the commit message (if you want a jenkins rebuild you need to trigger it yourself):
https:/
http://
Executed test runs:
FAILURE: http://
SUCCESS: http://
SUCCESS: http://
SUCCESS: http://
deb: http://
SUCCESS: http://
deb: http://
SUCCESS: http://
deb: http://
SUCCESS: http://
SUCCESS: http://
Click here to trigger a rebuild:
http://
Alexandros Frantzis (afrantzis) wrote:
46 +find_program(
47 +
48 +if(NOT GLMARK2_
49 + message("warning: glmark2-es2-mir not found, performance tests _may_ fail.")
50 +endif()
Since we are not running the program automatically when building, I am not sure how useful it is to print this message at build time.
77 +#include <stdlib.h>
We also need <stdio.h> for popen, getline, etc. (we shouldn't depend on other headers bringing it in).
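To make the point concrete, here is a self-contained sketch of the popen/getline capture loop the test relies on, with the header included directly. The `run_and_capture` helper is a made-up name for illustration, not code from the branch; note also that streams opened with popen should be closed with pclose rather than fclose.

```cpp
#include <cstdio>    // popen, pclose; POSIX getline
#include <cstdlib>   // free
#include <string>

// Run a shell command and capture its stdout line by line,
// mirroring the popen/getline pattern in test_glmark2-es2-mir.cpp.
std::string run_and_capture(char const* cmd)
{
    std::string output;
    FILE* in = popen(cmd, "r");
    if (!in)
        return output;

    char* line = nullptr;
    size_t len = 0;
    while (getline(&line, &len, in) != -1)  // getline is declared in <stdio.h>
        output += line;

    free(line);   // getline allocates the buffer; the caller frees it
    pclose(in);   // pclose, not fclose, for popen streams
    return output;
}
```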
Some style issues (see http://
85 +class GLMark2Test : public ::testing::Test
86 +{
...
Mir uses 4-space indentation.
88 + enum ResultFileType {RAW, JSON};
Lowercase identifiers for enum values.
103 + while((read = getline(&line, &len, in)) != -1)
105 + if(file_type == RAW)
107 + else if(file_type == JSON)
109 + if(boost::regex_match(line, matches, re_glmark2_score))
Spaces between keyword (if,while) and opening parenthesis.
105 + if(file_type == RAW)
106 + glmark2_output << line;
Braces around statement, since the 'else' clause also has braces.
94 + const char *cmd
93 + FILE *in;
...
'*' goes with type in our C++ code and const goes after the type. So: "FILE* in" and "char const* cmd"
98 + std::ofstream glmark2_output;
No need to define all variables at the beginning of the code block if it is more convenient (or necessary for RAII purposes) not to do so. So, for example, it is allowed to have "std::ofstream glmark2_output{output_filename};" at the point of use.
123 + glmark2_output.close();
No need to manually close the file, it's closed on std::ofstream destruction.
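A minimal sketch of the RAII point (the `write_score` helper and filename are made up for illustration): constructing the std::ofstream with the filename opens it, and its destructor flushes and closes it when the object goes out of scope, so neither an explicit open() nor close() is needed.

```cpp
#include <fstream>
#include <string>

// The ofstream is opened by its constructor and closed
// automatically by its destructor at the end of the scope.
void write_score(std::string const& filename, std::string const& json)
{
    std::ofstream glmark2_output{filename};  // opens here; no .open() call
    glmark2_output << json;
}   // destructor flushes and closes the file here
```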
Daniel van Vugt (vanvugt) wrote:
I have no opinion, but Jenkins won't ever land this till you add a commit message up the top.
Josh Arenson (josharenson) wrote:
> I have no opinion, but Jenkins won't ever land this till you add a commit
> message up the top.
Thanks for the heads up. I'm working on a refactor and will add the message when done.
Alberto Aguirre (albaguirre) wrote:
> > I have no opinion, but Jenkins won't ever land this till you add a commit
> > message up the top.
> Thanks for the heads up. I'm working on a refactor and will add the message
> when done.
You should mark this as "work in progress" so that others don't try to review code that is about to change.
Daniel van Vugt (vanvugt):
PS Jenkins bot (ps-jenkins) wrote:
FAILED: Continuous integration, rev:1577
No commit message was specified in the merge proposal. Click on the following link and set the commit message (if you want a jenkins rebuild you need to trigger it yourself):
https:/
http://
Executed test runs:
SUCCESS: http://
SUCCESS: http://
SUCCESS: http://
SUCCESS: http://
deb: http://
SUCCESS: http://
deb: http://
SUCCESS: http://
deb: http://
SUCCESS: http://
SUCCESS: http://
Click here to trigger a rebuild:
http://
Alexandros Frantzis (afrantzis) wrote:
119 + /*HACK to wait for the server to start*/
You can override the mir::DefaultServerConfiguration with your own implementation to wait for server start.
/*HACK to stop the mir server*/
You can get the main loop object from mir::DefaultServerConfiguration.
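For reference, the wait-for-start handshake in the proposed code is the standard condition-variable pattern; a self-contained sketch with generic names (StartupSignal is made up, not a Mir API), using a predicate and a deadline rather than an unconditional wait:

```cpp
#include <chrono>
#include <condition_variable>
#include <mutex>
#include <thread>

// Generic form of the startup handshake used in the test: a worker
// thread signals readiness; the waiting thread blocks with a deadline.
struct StartupSignal
{
    std::mutex mutex;
    std::condition_variable cv;
    bool started = false;

    void notify_started()
    {
        std::lock_guard<std::mutex> lock(mutex);
        started = true;
        cv.notify_one();
    }

    // Returns true if notify_started() ran before the deadline.
    bool wait_for_start(std::chrono::milliseconds timeout)
    {
        std::unique_lock<std::mutex> lock(mutex);
        return cv.wait_for(lock, timeout, [this] { return started; });
    }
};
```

The predicate overload of wait_for handles spurious wakeups, which the hand-rolled `while(!result)` loop in the diff also guards against.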
Alan Griffiths (alan-griffiths) wrote:
> 119 + /*HACK to wait for the server to start*/
>
> You can override the mir::DefaultServerConfiguration with your
> own implementation to wait for server start.
>
> /*HACK to stop the mir server*/
>
> You can get the main loop object from mir::DefaultServerConfiguration.
Even simpler: just use the pre-existing InProcessServer test fixture to start the server before entering the test body and close it afterwards.
Alexandros Frantzis (afrantzis) wrote:
> > 119 + /*HACK to wait for the server to start*/
> >
> > You can override the mir::DefaultServerConfiguration with your
> > own implementation to wait for server start.
> >
> > /*HACK to stop the mir server*/
> >
> > You can get the main loop object from mir::DefaultServerConfiguration.
>
> Even simpler: just use the pre-existing InProcessServer test fixture to start
> the server before entering the test body and close it afterwards.
Even better, +1
Alberto Aguirre (albaguirre) wrote:
~~~
111 + FILE* in;
112 + char* line = NULL;
113 + size_t len = 0;
114 + ssize_t read;
123 + ASSERT_TRUE((in = popen(cmd, "r")));
126 + while ((read = getline(&line, &len, in)) != -1)
151 + free(line);
152 + fclose(in);
~~~
I'd like to see that in a separate class under mir_test, as it could be useful if we need to run other benchmark programs; alternatively, use the pstreams library, which already does that for you.
~~~
virtual void RunGLMark2(char const* output_filename, ResultFileType file_type)
~~~
RunGLMark2 is not in the style described at http://
~~~
106 + enum ResultFileType {raw, json};
~~~
Use enum class.
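For comparison, a scoped enum keeps raw and json out of the enclosing scope and blocks implicit conversion to int; a minimal sketch (the enum values come from the diff, while `extension_for` is a made-up consumer for illustration):

```cpp
#include <cstring>  // std::strcmp, for the usage check below

// Scoped enumeration: values must be qualified (ResultFileType::raw)
// and do not implicitly convert to int.
enum class ResultFileType { raw, json };

// Example consumer: choose a file extension per result type.
char const* extension_for(ResultFileType type)
{
    switch (type)
    {
    case ResultFileType::raw:  return ".txt";
    case ResultFileType::json: return ".json";
    }
    return "";
}
```

For example, `std::strcmp(extension_for(ResultFileType::json), ".json") == 0` holds.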
Alberto Aguirre (albaguirre) wrote:
Maybe this will be useful:
https:/
~~~
Popen p{"glmark2-es2-mir --fullscreen"};
std::string line;
while(p.
{
    <do something with line>
}
~~~
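Filling in the idea with a self-contained sketch: this Popen is a hypothetical stand-in for the class in the linked gist, built on POSIX popen, and `get_line` is an assumed member name (the exact API of the gist is not shown here).

```cpp
#include <cstdio>
#include <stdexcept>
#include <string>

// Minimal RAII wrapper over popen(3): runs a shell command and
// hands back its stdout one line at a time.
class Popen
{
public:
    explicit Popen(std::string const& cmd)
        : stream{popen(cmd.c_str(), "r")}
    {
        if (!stream)
            throw std::runtime_error("popen failed: " + cmd);
    }

    ~Popen() { pclose(stream); }

    Popen(Popen const&) = delete;
    Popen& operator=(Popen const&) = delete;

    // Reads the next line (without the trailing '\n') into line.
    // Returns false once the stream is exhausted.
    bool get_line(std::string& line)
    {
        line.clear();
        int c;
        while ((c = fgetc(stream)) != EOF && c != '\n')
            line.push_back(static_cast<char>(c));
        return c == '\n' || !line.empty();
    }

private:
    FILE* stream;
};
```

Usage then matches the loop in the comment above: construct Popen with the command, then `while (p.get_line(line)) { /* parse line */ }`.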
PS Jenkins bot (ps-jenkins) wrote:
FAILED: Continuous integration, rev:1579
http://
Executed test runs:
SUCCESS: http://
SUCCESS: http://
FAILURE: http://
SUCCESS: http://
deb: http://
SUCCESS: http://
deb: http://
SUCCESS: http://
deb: http://
FAILURE: http://
FAILURE: http://
Click here to trigger a rebuild:
http://
Preview Diff
1 | === modified file 'debian/mir-test-tools.install' |
2 | --- debian/mir-test-tools.install 2014-03-06 06:05:17 +0000 |
3 | +++ debian/mir-test-tools.install 2014-04-30 23:22:39 +0000 |
4 | @@ -2,5 +2,6 @@ |
5 | usr/bin/mir_unit_tests |
6 | usr/bin/mir_acceptance_tests |
7 | usr/bin/mir_integration_tests |
8 | +usr/bin/mir_performance_tests |
9 | usr/lib/*/mir/tools/libmirclientlttng.so |
10 | usr/lib/*/mir/tools/libmirserverlttng.so |
11 | |
12 | === modified file 'tests/CMakeLists.txt' |
13 | --- tests/CMakeLists.txt 2014-03-26 05:48:59 +0000 |
14 | +++ tests/CMakeLists.txt 2014-04-30 23:22:39 +0000 |
15 | @@ -26,6 +26,7 @@ |
16 | |
17 | option(MIR_BUILD_ACCEPTANCE_TESTS "Build acceptance tests" ON) |
18 | option(MIR_BUILD_INTEGRATION_TESTS "Build integration tests" ON) |
19 | +option(MIR_BUILD_PERFORMANCE_TESTS "Build performance tests" ON) |
20 | option(MIR_BUILD_UNIT_TESTS "Build unit tests" ON) |
21 | |
22 | if (MIR_BUILD_ACCEPTANCE_TESTS) |
23 | @@ -36,6 +37,10 @@ |
24 | add_subdirectory(integration-tests/) |
25 | endif (MIR_BUILD_INTEGRATION_TESTS) |
26 | |
27 | +if (MIR_BUILD_PERFORMANCE_TESTS) |
28 | + add_subdirectory(performance-tests/) |
29 | +endif(MIR_BUILD_PERFORMANCE_TESTS) |
30 | + |
31 | if (MIR_BUILD_UNIT_TESTS) |
32 | add_subdirectory(unit-tests/) |
33 | endif (MIR_BUILD_UNIT_TESTS) |
34 | |
35 | === added directory 'tests/performance-tests' |
36 | === added file 'tests/performance-tests/CMakeLists.txt' |
37 | --- tests/performance-tests/CMakeLists.txt 1970-01-01 00:00:00 +0000 |
38 | +++ tests/performance-tests/CMakeLists.txt 2014-04-30 23:22:39 +0000 |
39 | @@ -0,0 +1,26 @@ |
40 | +set( |
41 | + PERFORMANCE_TESTS_SOURCES |
42 | + |
43 | + test_glmark2-es2-mir.cpp |
44 | +) |
45 | + |
46 | +add_executable( |
47 | + mir_performance_tests |
48 | + |
49 | + ${PERFORMANCE_TESTS_SOURCES} |
50 | +) |
51 | + |
52 | +target_link_libraries( |
53 | + mir_performance_tests |
54 | + mirserver |
55 | + |
56 | + ${Boost_LIBRARIES} |
57 | + ${GTEST_BOTH_LIBRARIES} |
58 | + ${GMOCK_LIBRARY} |
59 | + ${GMOCK_MAIN_LIBRARY} |
60 | +) |
61 | + |
62 | +install( |
63 | + TARGETS mir_performance_tests |
64 | + RUNTIME DESTINATION ${CMAKE_INSTALL_BINDIR} |
65 | +) |
66 | |
67 | === added file 'tests/performance-tests/test_glmark2-es2-mir.cpp' |
68 | --- tests/performance-tests/test_glmark2-es2-mir.cpp 1970-01-01 00:00:00 +0000 |
69 | +++ tests/performance-tests/test_glmark2-es2-mir.cpp 2014-04-30 23:22:39 +0000 |
70 | @@ -0,0 +1,111 @@ |
71 | +#include "mir/default_server_configuration.h" |
72 | +#include "mir/display_server.h" |
73 | +#include "mir/run_mir.h" |
74 | + |
75 | +#include <gtest/gtest.h> |
76 | + |
77 | +#include <stdlib.h> |
78 | +#include <stdio.h> |
79 | + |
80 | +#include <boost/date_time/posix_time/posix_time.hpp> |
81 | +#include <boost/regex.hpp> |
82 | +#include <boost/thread/thread.hpp> |
83 | +#include <condition_variable> |
84 | +#include <iostream> |
85 | +#include <mutex> |
86 | +#include <fstream> |
87 | +#include <string> |
88 | +#include <thread> |
89 | + |
90 | +class GLMark2Test : public ::testing::Test |
91 | +{ |
92 | +protected: |
93 | + enum ResultFileType {raw, json}; |
94 | + mir::DisplayServer* result{nullptr}; |
95 | + std::thread server_thread; |
96 | + |
97 | + mir::DisplayServer* launch_mir_server() |
98 | + { |
99 | + char const* argv[] = {""}; |
100 | + int argc = sizeof(argv) / sizeof(argv[0]); |
101 | + mir::DefaultServerConfiguration config(argc, argv); |
102 | + |
103 | + std::condition_variable cv; |
104 | + std::mutex mutex; |
105 | + server_thread = std::thread([&] |
106 | + { |
107 | + try |
108 | + { |
109 | + run_mir(config, [&](mir::DisplayServer& ds) { |
110 | + std::unique_lock<std::mutex> lock(mutex); |
111 | + result = &ds; |
112 | + cv.notify_one(); |
113 | + }); |
114 | + } |
115 | + catch(...) |
116 | + { |
117 | + mir::report_exception(std::cerr); |
118 | + } |
119 | + }); |
120 | + |
121 | + using namespace std::chrono; |
122 | + auto const time_limit = system_clock::now() + seconds(2); |
123 | + std::unique_lock<std::mutex> lock(mutex); |
124 | + while(!result) |
125 | + { |
126 | + cv.wait_until(lock, time_limit); |
127 | + } |
128 | + |
129 | + return result; |
130 | + } |
131 | + |
132 | + virtual void RunGLMark2(char const* output_filename, ResultFileType file_type) |
133 | + { |
134 | + mir::DisplayServer* ds = launch_mir_server(); |
135 | + ASSERT_TRUE(ds); |
136 | + |
137 | + boost::cmatch matches; |
138 | + boost::regex re_glmark2_score(".*glmark2\\s+Score:\\s+(\\d+).*"); |
139 | + FILE* in; |
140 | + char* line = NULL; |
141 | + size_t len = 0; |
142 | + ssize_t read; |
143 | + std::ofstream glmark2_output; |
144 | + glmark2_output.open(output_filename); |
145 | + |
146 | + char const* cmd = "glmark2-es2-mir -b texture --fullscreen"; |
147 | + ASSERT_TRUE((in = popen(cmd, "r"))); |
148 | + |
149 | + while ((read = getline(&line, &len, in)) != -1) |
150 | + { |
151 | + if (file_type == raw) |
152 | + { |
153 | + glmark2_output << line; |
154 | + } |
155 | + else if (file_type == json) |
156 | + { |
157 | + std::cout << line; |
158 | + if (boost::regex_match(line, matches, re_glmark2_score)) |
159 | + { |
160 | + std::string json = "{"; |
161 | + json += "'benchmark_name':'glmark2-es2-mir'"; |
162 | + json += ","; |
163 | + json += "'score':'" + matches[1] + "'"; |
164 | + json += "}"; |
165 | + glmark2_output << json; |
166 | + break; |
167 | + } |
168 | + } |
169 | + } |
170 | + |
171 | + ds->stop(); |
172 | + if (server_thread.joinable()) server_thread.join(); |
173 | + free(line); |
174 | + fclose(in); |
175 | + } |
176 | +}; |
177 | + |
178 | +TEST_F(GLMark2Test, benchmark_fullscreen_default) |
179 | +{ |
180 | + RunGLMark2("glmark2_fullscreen_default.json", json); |
181 | +} |
We don't need to link mir_performance_tests against so much, right?