41 changes: 39 additions & 2 deletions README.md
@@ -44,10 +44,20 @@ TBD (PRs welcome!)
In the coordinator VM, check out this solr-bench repository.

1. `mvn clean compile assembly:single`
2. `./cleanup.sh && ./stress.sh -c <commit> <config-file>`
2. `./cleanup.sh && ./stress.sh -c <commit> -f <config-file>`

Example: `./cleanup.sh ./stress.sh -c dfde16a004206cc92e21cc5a6cad9030fbe13c20 suites/stress-facets-local.json`
Example: `./cleanup.sh && ./stress.sh -c dfde16a004206cc92e21cc5a6cad9030fbe13c20 -f suites/stress-facets-local.json`

Usage: `./stress.sh -c <commit> [-g] [-v] -f <config-file>`

| Parameter | Required? | Description |
| ------- | ---------- | --------- |
| -f | Required | Configuration file for a suite |
| -c | Required | Commit point to run against |
| -g | Optional | Generate validations file(s) for all query benchmark tasks |
| -v | Optional | Perform validations based on specified validations file(s) |

Note: Only one of `-g` or `-v` may be specified at a time.
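
A typical two-phase run combines the flags above (illustrative commands; the commit hash and suite file are placeholders):

```
# Phase 1: generate the validations file for the suite
./cleanup.sh && ./stress.sh -c <commit> -g -f suites/stress-facets-local.json

# Phase 2: after reviewing the generated file, re-run with validation enabled
./cleanup.sh && ./stress.sh -c <commit> -v -f suites/stress-facets-local.json
```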

#### Available tests
```
@@ -66,6 +76,33 @@ Note: This is subject to change

TBD

### Validations

User workflow:

1. Run the suite with the `-g` (generate validations) flag. This generates a file in the `suites/` dir containing `<query, numFound, facets>` tuples.
2. Manually verify the generated file (`suites/validations-<testname>-docs-<docs>-queries-<numQueries>.json`).
3. The validations file can then be used for validation in subsequent runs.
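
The generated file's exact schema is determined by the implementation; based on the `<query, numFound, facets>` tuple described above, an entry might look roughly like this (field names are illustrative assumptions, not the authoritative format):

```
[
  {
    "query": "q=*:*&facet=true&facet.field=category",
    "numFound": 12345,
    "facets": {"category": {"books": 9876, "music": 2469}}
  }
]
```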

#### Using validations with a generated file

a. Add a `"validation": "<file>"` parameter to the query-benchmark definition.
b. Run `stress.sh` with the `-v` (validate) flag. It will use the validations file in the query benchmark task and report the number of successful and failed queries.
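
Putting the two steps together, a query-benchmark definition referencing a generated validations file might look like this (all fields other than `"validation"` are illustrative assumptions about the suite config):

```
"query-benchmark": {
  "name": "facet-queries",
  "collection": "mycollection",
  "validation": "suites/validations-stress-facets-local-docs-10000-queries-500.json"
}
```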

The results are reported in the query benchmark task's output, for example (500 validations succeeded, 0 failed):

```
{
  threads=1,
  50th=8.2426775,
  90th=18.409747399999993,
  95th=28.618552849999993,
  mean=16.281752914583333,
  total-queries=480,
  total-time=14085,
  validations-succeeded=500,
  validations-failed=0
}
```

### Visualization

To select/process the test results of specific branch or branch comparisons:
Expand Down
2 changes: 1 addition & 1 deletion archived-suites/multitenant.json
@@ -123,7 +123,7 @@
"threadpool": "collection-api-threadpool",
"wait-for": "task1",
"mode": "async",
"validations": ["all-replicas-active"]
"validations": ["all-replicas-active"] // NOCOMMIT: reimplement
},
"task2c": {
"description": "MODIFYCOLLECTION at random",
4 changes: 2 additions & 2 deletions backtest.sh
@@ -16,8 +16,8 @@ do echo; echo "Running $commit"
echo "Trying commit: $commit"
./cleanup.sh
./stress.sh -c $commit -v -f suites/$testnamefile
python3 graph-scripts/generate_graph_json.py -r suites/results/cluster-test -r suites/results/prs-vs-nonprs -r suites/results/stress-facets-local -b branch_9x...branch_9_1
cp graph/* /var/www/html
fi

done
253 changes: 162 additions & 91 deletions src/main/java/org/apache/solr/benchmarks/BenchmarksMain.java

Large diffs are not rendered by default.

23 changes: 23 additions & 0 deletions src/main/java/org/apache/solr/benchmarks/ControlledExecutor.java
@@ -1,13 +1,34 @@
package org.apache.solr.benchmarks;


import org.apache.commons.io.FileUtils;
import org.apache.commons.math3.stat.descriptive.SynchronizedDescriptiveStatistics;
import org.apache.solr.benchmarks.BenchmarksMain.QueryCallable;
import org.apache.solr.benchmarks.beans.QueryBenchmark;
import org.apache.solr.benchmarks.beans.SolrBenchQueryResponse;
import org.apache.solr.benchmarks.readers.TarGzFileReader;
import org.apache.solr.client.solrj.ResponseParser;
import org.apache.solr.client.solrj.SolrRequest.METHOD;
import org.apache.solr.client.solrj.impl.HttpSolrClient;
import org.apache.solr.client.solrj.impl.InputStreamResponseParser;
import org.apache.solr.client.solrj.request.QueryRequest;
import org.apache.solr.client.solrj.request.RequestWriter;
import org.apache.solr.common.params.CommonParams;
import org.apache.solr.common.params.MapSolrParams;
import org.apache.solr.common.params.SolrParams;
import org.apache.solr.common.util.NamedList;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;

import java.io.File;
import java.io.IOException;
import java.lang.invoke.MethodHandles;
import java.text.ParseException;
import java.text.SimpleDateFormat;
import java.util.ArrayList;
import java.util.List;
import java.util.Map;
import java.util.Random;
import java.util.Timer;
import java.util.TimerTask;
import java.util.concurrent.*;
@@ -90,6 +111,8 @@ public Thread newThread(Runnable r) {
backPressureLimiter = new BackPressureLimiter(threads * 10); //at most 10 * # of thread pending tasks
}



public void run() throws InterruptedException, ExecutionException {
startTime = System.currentTimeMillis();

19 changes: 12 additions & 7 deletions src/main/java/org/apache/solr/benchmarks/QueryGenerator.java
@@ -8,19 +8,22 @@
import org.apache.solr.benchmarks.readers.TarGzFileReader;
import org.apache.solr.client.solrj.ResponseParser;
import org.apache.solr.client.solrj.SolrQuery;
import org.apache.solr.client.solrj.SolrRequest.METHOD;
import org.apache.solr.client.solrj.impl.InputStreamResponseParser;
import org.apache.solr.client.solrj.request.QueryRequest;
import org.apache.solr.client.solrj.request.RequestWriter;
import org.apache.solr.common.params.CommonParams;
import org.apache.solr.common.params.MapSolrParams;
import org.apache.solr.common.params.SolrParams;
import org.apache.solr.common.util.Pair;

import java.io.ByteArrayOutputStream;
import java.io.File;
import java.io.IOException;
import java.nio.charset.StandardCharsets;
import java.util.concurrent.atomic.AtomicLong;


public class QueryGenerator {
final QueryBenchmark queryBenchmark;
List<String> queries = new ArrayList<>();
@@ -54,7 +57,7 @@ public QueryGenerator(QueryBenchmark queryBenchmark) throws IOException, ParseEx
}


public QueryRequest nextRequest() {
public Pair<String, QueryRequest> nextRequest() {
while (counter.get() < queryBenchmark.offset) {
long idx = random == null ? counter.get() : random.nextInt(queries.size());
String q = queries.get((int) (idx % queries.size()));
@@ -68,9 +71,11 @@ public QueryRequest nextRequest() {

QueryRequest request;
if (queryBenchmark.templateValues != null && !queryBenchmark.templateValues.isEmpty()) {
PropertiesUtil.substituteProperty(q, queryBenchmark.templateValues);
q = PropertiesUtil.substituteProperty(q, queryBenchmark.templateValues);
}

String qString = q;

//TODO apply templates if any
if (Boolean.TRUE.equals(queryBenchmark.isJsonQuery)) {
request = new QueryRequest() {
@@ -81,7 +86,7 @@

@Override
public RequestWriter.ContentWriter getContentWriter(String expectedType) {
return new RequestWriter.StringPayloadContentWriter(q, CommonParams.JSON_MIME);
return new RequestWriter.StringPayloadContentWriter(qString, CommonParams.JSON_MIME);
}

@Override
@@ -96,7 +101,7 @@ public ResponseParser getResponseParser() {

@Override
public String toString() {
return q;
return qString;
}

@Override
@@ -111,14 +116,14 @@ public Map<String, String> getHeaders() {
};

} else {
request = new QueryRequest(Util.parseQueryString(q)) {
request = new QueryRequest(Util.parseQueryString(qString)) {
@Override
public String getCollection() {
return queryBenchmark.collection;
}
};
}

return request;
return new Pair<>(qString, request);
}
}
}
@@ -1,3 +1,4 @@
package org.apache.solr.benchmarks;
import java.io.File;
import java.io.FileInputStream;
import java.io.IOException;
@@ -21,10 +22,6 @@
import org.apache.commons.cli.ParseException;
import org.apache.commons.io.FileUtils;
import org.apache.commons.io.IOUtils;
import org.apache.solr.benchmarks.BenchmarksMain;
import org.apache.solr.benchmarks.MetricsCollector;
import org.apache.solr.benchmarks.Util;
import org.apache.solr.benchmarks.WorkflowResult;
import org.apache.solr.benchmarks.beans.Cluster;
import org.apache.solr.benchmarks.exporter.ExporterFactory;
import org.apache.solr.benchmarks.solrcloud.CreateWithAdditionalParameters;
@@ -64,11 +61,18 @@ private static CommandLine getCLIParams(String[] args) throws ParseException {
Options cliOptions = new Options();
cliOptions.addRequiredOption("f", "file", true, "Configuration file");
cliOptions.addRequiredOption("c", "commit", true, "Commit ID");
cliOptions.addOption("g", "generate-validations", false, "Generate validations data for all query tasks");
cliOptions.addOption("v", "validate", false, "Enable validations for all query tasks (not done unless specified)");

CommandLineParser cliParser = new DefaultParser();
CommandLine cli = cliParser.parse(cliOptions, args);
return cli;
}

public static boolean generateValidations = false;
public static boolean validate = false;


public static void main(String[] args) throws Exception {
CommandLine cliParams = getCLIParams(args);
String configFile = cliParams.getOptionValue("f");
@@ -79,6 +83,13 @@ public static void main(String[] args) throws Exception {
log.info("The base directory for the suite: " + SUITE_BASE_DIR);
System.setProperty("SUITE_BASE_DIRECTORY", SUITE_BASE_DIR);

generateValidations = cliParams.hasOption("g");
validate = cliParams.hasOption("v");
if (generateValidations && validate) {
log.error("Cannot use -g (--generate-validations) and -v (--validate) at the same time.");
System.exit(1);
}

Workflow workflow = new ObjectMapper().readValue(FileUtils.readFileToString(new File(configFile), "UTF-8"), Workflow.class);
Cluster cluster = workflow.cluster;

@@ -678,9 +689,6 @@ private static Callable taskCallable(Workflow workflow, SolrCloud cloud, Map<Str
}
long end = System.currentTimeMillis();

runValidations(instance.validations, workflow, cloud);


return end-start;
};
return c;
@@ -721,26 +729,6 @@ private static int getNumInactiveReplicas(SolrNode node, CloudSolrClient client)
return numInactive;
}

public static void runValidations(List<String> validations, Workflow workflow, SolrCloud cloud) {
if (validations == null) return;
for (String v: validations) {
Workflow.Validation validationDefinition = workflow.validations.get(v);
if (validationDefinition.numInactiveReplicas != null) {
// get num inactive replicas
try (CloudSolrClient client = buildSolrClient(cloud);) {
int numInactive = getNumInactiveReplicas(null, client);
log.info("Validation: inactive replicas are " + numInactive);
if (numInactive > validationDefinition.numInactiveReplicas) {
log.error("Failed validation: " + new ObjectMapper().writeValueAsString(validationDefinition));
}
} catch (KeeperException | InterruptedException | IOException e) {
// TODO Auto-generated catch block
e.printStackTrace();
}
}
}
}

/**
* Populate some random variables like RANDOM_COLLECTION or RANDOM_SHARD
*/
@@ -1,3 +1,4 @@
package org.apache.solr.benchmarks;
import java.util.List;
import java.util.Map;

@@ -1,3 +1,4 @@
package org.apache.solr.benchmarks;
import java.util.List;
import java.util.ArrayList;
import java.util.Map;
@@ -1,3 +1,4 @@
package org.apache.solr.benchmarks;
import java.util.Collections;
import java.util.List;
import java.util.Map;
@@ -37,10 +38,6 @@ public class Workflow {
@JsonProperty("execution-plan")
Map<String, TaskInstance> executionPlan;

// Validations definition
@JsonProperty("validations")
Map<String, Validation> validations;

@JsonProperty("metrics")
public List<String> metrics;

@@ -5,6 +5,8 @@
import java.util.HashMap;
import java.util.Map;

import org.apache.solr.benchmarks.validations.Validations;

public class QueryBenchmark extends BaseBenchmark {
@JsonProperty("collection")
public String collection;
@@ -58,4 +60,8 @@ public class QueryBenchmark extends BaseBenchmark {
*/
@JsonProperty("detailed-stats")
public boolean detailedStats = false;

@JsonProperty("validations")
public Validations validations;

}
@@ -0,0 +1,66 @@
package org.apache.solr.benchmarks.beans;

import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.InputStream;
import java.lang.invoke.MethodHandles;
import java.util.Map;
import java.util.concurrent.atomic.AtomicInteger;
import java.util.concurrent.atomic.AtomicReference;

import org.apache.commons.io.IOUtils;
import org.apache.http.client.methods.CloseableHttpResponse;
import org.apache.solr.client.solrj.request.QueryRequest;
import org.apache.solr.common.util.NamedList;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;

// nocommit: javadocs
public class SolrBenchQueryResponse {
Review comment (Contributor):
Nice to have this class to avoid double reading the stream that triggers errors 💪🏼


private static final Logger log = LoggerFactory.getLogger(MethodHandles.lookup().lookupClass());

public String queryString;

public String responseString = null;

public NamedList<Object> rawResponse;

public boolean isSuccessfulResponse = false;

public SolrBenchQueryResponse(String queryString, NamedList<Object> response) {
this.rawResponse = response;
this.queryString = queryString;

InputStream responseStream = (InputStream) response.get("stream");
try {
responseString = getResponseStreamAsString(responseStream); // should only call this once, as this reads the stream!
Review comment (Contributor):
Is it possible to make reading the stream lazy? My concern is that this always read the response stream, which seems unnecessary if the caller does not care about detailed stats nor needing any validations (which i assume is a common case for general benchmarking?)

} catch (IOException e) {
log.warn("Failed to read the response stream for " + queryString);
}

isSuccessfulResponse = isSuccessfulRsp(response.get("closeableResponse"));
}

private String getResponseStreamAsString(InputStream responseStream) throws IOException {
ByteArrayOutputStream baos = new ByteArrayOutputStream();
IOUtils.copy(responseStream, baos);

return new String(baos.toByteArray(), java.nio.charset.StandardCharsets.UTF_8); // avoid the platform-default charset
}

private boolean isSuccessfulRsp(Object responseObj) {
if (responseObj instanceof CloseableHttpResponse) {
return ((CloseableHttpResponse) responseObj).getStatusLine().getStatusCode() == 200;
}
return false;
}
}


}