
Plugin Information

View Performance on the plugin site for more information.

This plugin allows you to capture reports from popular testing tools. Jenkins will generate trend charts of performance and robustness.
It can also set the final build status to successful, unstable, or failed, based on the reported error percentage.
Report formats supported:

- JMeter analysis report XML (.jtl)
- JUnit-format reports (e.g. the SoapUI JUnit-style output)

  1. Unknown User (ironman)

    JUnit reports do not work :(

    I've set up JUnit properly in Hudson, but it produces wrong results. All tests have min set to 9223372036854775807 and max to -9223372036854775808.

    I've noticed this logs in hudson console:

    Performance: Parsing JMeter report file TEST-hudson.test.ATest.xml
    Performance: Parsing JMeter report file TESTS-TestSuites.xml

    It tries to parse JUnit test output as JMeter.
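    Those sentinel numbers are Long.MAX_VALUE (9223372036854775807) and Long.MIN_VALUE (-9223372036854775808), the usual initial values of a min/max accumulator. When a parser matches zero samples (here, because JUnit output is fed to the JMeter parser), the accumulator is never updated and the initial values leak straight into the report. A minimal sketch of the mechanism (hypothetical code, not the plugin's actual implementation):

```java
// Minimal sketch of a min/max accumulator like the one behind the report table.
// If no sample is ever recorded, the initial sentinels leak through unchanged:
// min stays at Long.MAX_VALUE (9223372036854775807) and
// max stays at Long.MIN_VALUE (-9223372036854775808).
public class MinMaxAccumulator {
    long min = Long.MAX_VALUE;
    long max = Long.MIN_VALUE;

    void record(long durationMillis) {
        min = Math.min(min, durationMillis);
        max = Math.max(max, durationMillis);
    }

    public static void main(String[] args) {
        MinMaxAccumulator empty = new MinMaxAccumulator(); // zero samples parsed
        System.out.println("min=" + empty.min + " max=" + empty.max);
        // prints: min=9223372036854775807 max=-9223372036854775808
    }
}
```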

    1. Have you selected the correct parser?

      Could you send me those two files?

  2. Unknown User (

    Has anyone else had experience using the Performance plugin with the JUnit XML reporting it claims to support?

    I read that the output report has to be from SOAPU...

    Can someone please post a working example file?


    1. What do you want, a SoapUI example or just a JUnit XML file?

      I have not used SoapUI; I have only tested the feature by parsing the output files generated by Maven (Surefire) when running the tests.

  3. Unknown User (kblearner)

    I installed this plugin, configured it on Hudson, and am able to get the Trend and Performance Reports successfully. However, there seems to be a bug in the reporting. In my JMeter test plan, I have a CSV Data Set Config (an external CSV or TXT file) from which I read values into the test plan. On execution, the plugin reports in Hudson show timings for the CSV config element too, which skews the calculation of max, min, and average across all transactions. Ideally, only the response times for transaction controllers should be shown, NOT those for config elements. (Attached a sample view for reference.)

    If anybody has faced this issue or has any suggestions or comments, I would appreciate it if you could shed some light on it.


  4. Unknown User (

    We also have problems creating reports with JUnit... min is always 9223372036854775807, max -9223372036854775808.
    It does not seem to work :-(

    We use Hudson 1.336 and Performance Plugin 1.3, Maven, and JUnit 3.8.

    The XML file looks like this:

    <?xml version="1.0" encoding="UTF-8" ?>
    <testsuite failures="0" time="1.108" errors="0" skipped="0" tests="2" name="com.jaxlion.base.LogPathInfoTest">
    <property name="java.runtime.name" value="Java(TM) SE Runtime Environment"/>
    <property name="sun.boot.library.path" value="/usr/java/jdk1.6.0_16/jre/lib/amd64"/>
    <property name="java.vm.version" value="14.2-b01"/>
    <property name="java.vm.vendor" value="Sun Microsystems Inc."/>
    <property name="java.vendor.url" value="http://java.sun.com/"/>
    <property name="path.separator" value=":"/>
    <property name="java.vm.name" value="Java HotSpot(TM) 64-Bit Server VM"/>
    <property name="file.encoding.pkg" value="sun.io"/>
    <property name="user.country" value="US"/>
    <property name="sun.java.launcher" value="SUN_STANDARD"/>
    <property name="sun.os.patch.level" value="unknown"/>
    <property name="java.vm.specification.name" value="Java Virtual Machine Specification"/>
    <property name="user.dir" value="/opt/build/hudson/jobs/jaxlion-core/workspace/trunk"/>
    <property name="jaxlion.started" value="Wed Aug 25 07:35:00 CEST 2010"/>
    <property name="java.runtime.version" value="1.6.0_16-b01"/>
    <property name="java.awt.graphicsenv" value="sun.awt.X11GraphicsEnvironment"/>
    <property name="basedir" value="/opt/build/hudson/jobs/jaxlion-core/workspace/trunk"/>
    <property name="java.endorsed.dirs" value="/usr/java/jdk1.6.0_16/jre/lib/endorsed"/>
    <property name="os.arch" value="amd64"/>
    <property name="surefire.real.class.path" value="/tmp/surefirebooter5448448631377469438.jar"/>
    <property name="java.io.tmpdir" value="/tmp"/>
    <property name="line.separator" value="&#10;"/>
    <property name="java.vm.specification.vendor" value="Sun Microsystems Inc."/>
    <property name="os.name" value="Linux"/>
    <property name="sun.jnu.encoding" value="UTF-8"/>
    <property name="java.library.path" value="/usr/java/jdk1.6.0_16/jre/lib/amd64/server:/usr/java/jdk1.6.0_16/jre/lib/amd64:/usr/java/jdk1.6.0_16/jre/../lib/amd64:/usr/java/packages/lib/amd64:/lib:/usr/lib"/>
    <property name="javax.xml.parsers.SAXParserFactory" value="org.apache.xerces.jaxp.SAXParserFactoryImpl"/>
    <property name="surefire.test.class.path" value="/opt/build/hudson/jobs/jaxlion-core/workspace/trunk/target/test-classes:/opt/build/hudson/jobs/jaxlion-core/workspace/trunk/target/generated-classes/cobertura:/opt/tomcat/temp/.m2/repository/ch/loewenfels/loepa-commons/1.0.15/loepa-commons-1.0.15.jar:/opt/tomcat/temp/.m2/repository/commons-lang/commons-lang/2.4/commons-lang-2.4.jar:/opt/tomcat/temp/.m2/repository/junit/junit/3.8.2/junit-3.8.2.jar:/opt/tomcat/temp/.m2/repository/org/slf4j/slf4j-api/1.5.11/slf4j-api-1.5.11.jar:/opt/tomcat/temp/.m2/repository/org/slf4j/jul-to-slf4j/1.5.11/jul-to-slf4j-1.5.11.jar:/opt/tomcat/temp/.m2/repository/org/slf4j/jcl-over-slf4j/1.5.11/jcl-over-slf4j-1.5.11.jar:/opt/tomcat/temp/.m2/repository/ch/qos/logback/logback-classic/0.9.20/logback-classic-0.9.20.jar:/opt/tomcat/temp/.m2/repository/ch/qos/logback/logback-core/0.9.20/logback-core-0.9.20.jar:/opt/tomcat/temp/.m2/repository/javax/mail/mail/1.4/mail-1.4.jar:/opt/tomcat/temp/.m2/repository/commons-io/commons-io/1.4/commons-io-1.4.jar:/opt/tomcat/temp/.m2/repository/xerces/xercesImpl/2.9.1/xercesImpl-2.9.1.jar:/opt/tomcat/temp/.m2/repository/xml-apis/xml-apis/1.3.04/xml-apis-1.3.04.jar:/opt/tomcat/temp/.m2/repository/xalan/xalan/2.7.1/xalan-2.7.1.jar:/opt/tomcat/temp/.m2/repository/xalan/serializer/2.7.1/serializer-2.7.1.jar:/opt/tomcat/temp/.m2/repository/commons-collections/commons-collections/3.2.1/commons-collections-3.2.1.jar:/opt/tomcat/temp/.m2/repository/commons-digester/commons-digester/1.8/commons-digester-1.8.jar:/opt/tomcat/temp/.m2/repository/commons-beanutils/commons-beanutils/1.7.0/commons-beanutils-1.7.0.jar:/opt/tomcat/temp/.m2/repository/tar/tar/2.3/tar-2.3.jar:/opt/tomcat/temp/.m2/repository/org/springframework/spring-context/2.5.6/spring-context-2.5.6.jar:/opt/tomcat/temp/.m2/repository/aopalliance/aopalliance/1.0/aopalliance-1.0.jar:/opt/tomcat/temp/.m2/repository/org/springframework/spring-beans/2.5.6/spring-beans-2.5.6.jar:/opt/tomcat/temp/.m2/repository/
org/springframework/spring-core/2.5.6/spring-core-2.5.6.jar:/opt/tomcat/temp/.m2/repository/org/springframework/spring-aop/2.5.6/spring-aop-2.5.6.jar:/opt/tomcat/temp/.m2/repository/cglib/cglib-nodep/2.2/cglib-nodep-2.2.jar:/opt/tomcat/temp/.m2/repository/org/mockito/mockito-all/1.8.4/mockito-all-1.8.4.jar:/opt/tomcat/temp/.m2/repository/net/sourceforge/cobertura/cobertura/1.9.2/cobertura-1.9.2.jar:/opt/tomcat/temp/.m2/repository/org/apache/ant/ant/1.7.0/ant-1.7.0.jar:/opt/tomcat/temp/.m2/repository/org/apache/ant/ant-launcher/1.7.0/ant-launcher-1.7.0.jar:"/>
    <property name="java.specification.name" value="Java Platform API Specification"/>
    <property name="java.class.version" value="50.0"/>
    <property name="sun.management.compiler" value="HotSpot 64-Bit Server Compiler"/>
    <property name="os.version" value=""/>
    <property name="user.home" value="/opt/tomcat/temp"/>
    <property name="user.timezone" value="Europe/Zurich"/>
    <property name="java.awt.printerjob" value="sun.print.PSPrinterJob"/>
    <property name="java.specification.version" value="1.6"/>
    <property name="file.encoding" value="UTF-8"/>
    <property name="javax.xml.transform.TransformerFactory" value="org.apache.xalan.processor.TransformerFactoryImpl"/>
    <property name="user.name" value="tomcat"/>
    <property name="java.class.path" value="/opt/build/hudson/jobs/jaxlion-core/workspace/trunk/target/test-classes:/opt/build/hudson/jobs/jaxlion-core/workspace/trunk/target/generated-classes/cobertura:/opt/tomcat/temp/.m2/repository/ch/loewenfels/loepa-commons/1.0.15/loepa-commons-1.0.15.jar:/opt/tomcat/temp/.m2/repository/commons-lang/commons-lang/2.4/commons-lang-2.4.jar:/opt/tomcat/temp/.m2/repository/junit/junit/3.8.2/junit-3.8.2.jar:/opt/tomcat/temp/.m2/repository/org/slf4j/slf4j-api/1.5.11/slf4j-api-1.5.11.jar:/opt/tomcat/temp/.m2/repository/org/slf4j/jul-to-slf4j/1.5.11/jul-to-slf4j-1.5.11.jar:/opt/tomcat/temp/.m2/repository/org/slf4j/jcl-over-slf4j/1.5.11/jcl-over-slf4j-1.5.11.jar:/opt/tomcat/temp/.m2/repository/ch/qos/logback/logback-classic/0.9.20/logback-classic-0.9.20.jar:/opt/tomcat/temp/.m2/repository/ch/qos/logback/logback-core/0.9.20/logback-core-0.9.20.jar:/opt/tomcat/temp/.m2/repository/javax/mail/mail/1.4/mail-1.4.jar:/opt/tomcat/temp/.m2/repository/commons-io/commons-io/1.4/commons-io-1.4.jar:/opt/tomcat/temp/.m2/repository/xerces/xercesImpl/2.9.1/xercesImpl-2.9.1.jar:/opt/tomcat/temp/.m2/repository/xml-apis/xml-apis/1.3.04/xml-apis-1.3.04.jar:/opt/tomcat/temp/.m2/repository/xalan/xalan/2.7.1/xalan-2.7.1.jar:/opt/tomcat/temp/.m2/repository/xalan/serializer/2.7.1/serializer-2.7.1.jar:/opt/tomcat/temp/.m2/repository/commons-collections/commons-collections/3.2.1/commons-collections-3.2.1.jar:/opt/tomcat/temp/.m2/repository/commons-digester/commons-digester/1.8/commons-digester-1.8.jar:/opt/tomcat/temp/.m2/repository/commons-beanutils/commons-beanutils/1.7.0/commons-beanutils-1.7.0.jar:/opt/tomcat/temp/.m2/repository/tar/tar/2.3/tar-2.3.jar:/opt/tomcat/temp/.m2/repository/org/springframework/spring-context/2.5.6/spring-context-2.5.6.jar:/opt/tomcat/temp/.m2/repository/aopalliance/aopalliance/1.0/aopalliance-1.0.jar:/opt/tomcat/temp/.m2/repository/org/springframework/spring-beans/2.5.6/spring-beans-2.5.6.jar:/opt/tomcat/temp/.m2/repository/org/sprin
gframework/spring-core/2.5.6/spring-core-2.5.6.jar:/opt/tomcat/temp/.m2/repository/org/springframework/spring-aop/2.5.6/spring-aop-2.5.6.jar:/opt/tomcat/temp/.m2/repository/cglib/cglib-nodep/2.2/cglib-nodep-2.2.jar:/opt/tomcat/temp/.m2/repository/org/mockito/mockito-all/1.8.4/mockito-all-1.8.4.jar:/opt/tomcat/temp/.m2/repository/net/sourceforge/cobertura/cobertura/1.9.2/cobertura-1.9.2.jar:/opt/tomcat/temp/.m2/repository/org/apache/ant/ant/1.7.0/ant-1.7.0.jar:/opt/tomcat/temp/.m2/repository/org/apache/ant/ant-launcher/1.7.0/ant-launcher-1.7.0.jar:"/>
    <property name="java.vm.specification.version" value="1.0"/>
    <property name="sun.arch.data.model" value="64"/>
    <property name="java.home" value="/usr/java/jdk1.6.0_16/jre"/>
    <property name="java.specification.vendor" value="Sun Microsystems Inc."/>
    <property name="user.language" value="en"/>
    <property name="java.vm.info" value="mixed mode"/>
    <property name="java.version" value="1.6.0_16"/>
    <property name="java.ext.dirs" value="/usr/java/jdk1.6.0_16/jre/lib/ext:/usr/java/packages/lib/ext"/>
    <property name="sun.boot.class.path" value="/usr/java/jdk1.6.0_16/jre/lib/resources.jar:/usr/java/jdk1.6.0_16/jre/lib/rt.jar:/usr/java/jdk1.6.0_16/jre/lib/sunrsasign.jar:/usr/java/jdk1.6.0_16/jre/lib/jsse.jar:/usr/java/jdk1.6.0_16/jre/lib/jce.jar:/usr/java/jdk1.6.0_16/jre/lib/charsets.jar:/usr/java/jdk1.6.0_16/jre/classes"/>
    <property name="java.vendor" value="Sun Microsystems Inc."/>
    <property name="javax.xml.parsers.DocumentBuilderFactory" value="org.apache.xerces.jaxp.DocumentBuilderFactoryImpl"/>
    <property name="localRepository" value="/opt/tomcat/temp/.m2/repository"/>
    <property name="file.separator" value="/"/>
    <property name="java.vendor.url.bug" value="http://java.sun.com/cgi-bin/bugreport.cgi"/>
    <property name="sun.cpu.endian" value="little"/>
    <property name="sun.io.unicode.encoding" value="UnicodeLittle"/>
    <property name="sun.cpu.isalist" value=""/>
    <testcase time="1.097" classname="com.jaxlion.base.LogPathInfoTest" name="testGetServiceLogPath"/>
    <testcase time="0" classname="com.jaxlion.base.LogPathInfoTest" name="testGetSystemLogPath"/>
    </testsuite>

    1. Unknown User (

      I also get the same results creating reports with JUnit: min is always 9223372036854775807, max -9223372036854775808.

      I thought it might be a JUnit version issue, but we are using JUnit 4.5, Hudson 1.375, and Performance Plugin 1.3.

      Doesn't anyone know how to get it working right?  Bueller?  Bueller?

      1. Unknown User (

        I dug a little more and found this (by clicking the Help icon on the Configure page of a job where the Performance plugin is set up):

        This plugin understands the JMeter analysis report XML format and the SOAPUI report in JUnit format.
        This plug-in does not perform the actual analysis; it only displays useful information about analysis results, such as average response time, historical result trend, a web UI for viewing analysis reports, and so on.

    2. Unknown User (

      I dug a little more into using the Performance Plugin with JUnit reports... I downloaded the source and extended the PerformanceReportTest class with an additional test using my own JUnit report file. The JUnitParser works fine in this unit test.

      But on Hudson it doesn't work :-(. The plug-in finds the test files, but the report and trend are wrong (min is always 9223372036854775807, max -9223372036854775808).

      Any idea?

  5. Unknown User (

    Well, I am having a tough time getting Hudson to read my JMeter report.

    If I specify the absolute path to my JMeter report file - "C:/SelNG/jmeter2/target/jmeter-reports/GoogleAdvanceSearch-100906.xml" - I get to see the performance report. But I cannot use an absolute path, as the report name "GoogleAdvanceSearch-100906.xml" contains a timestamp. So I tried to use the glob pattern "**/*.xml". Now when I build, I encounter the following:

    Performance: Recording JMeter reports '**/*.xml'
    Performance: no JMeter files matching '**/*.xml' have been found. Has the report generated?. Setting Build to FAILURE
    Finished: FAILURE

    Is there anything I am missing here?

    Thanks in advance
    Tarun K

    1. Unknown User (kblearner)

      Hi Tarun,
      Is it the "JMeter" report or the "JUnit" report you're trying to read?
      If it's the JMeter report, you have to specify "**/*.jtl". This works perfectly fine.


      1. Unknown User (

        Actually, it is XML generated by the JMeter Maven plugin, hence I specified the path as "*/**.xml",

        but I always encounter this exception -

        Performance: Recording JMeter reports '*/**.xml'
        Performance: no JMeter files matching '*/**.xml' have been found. Has the report generated?. Setting Build to FAILURE
        Finished: FAILURE

        1. Replace '*/**.xml' with '**/*.xml'.
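        The pattern at issue is the Ant-style glob **/*.xml: ** crosses any number of directories, and *.xml matches the file name, which is what lets a timestamped name like GoogleAdvanceSearch-100906.xml match without an absolute path. A quick way to sanity-check a glob against your report path, using the JDK's PathMatcher rather than the plugin itself (the paths below are this thread's examples):

```java
import java.nio.file.FileSystems;
import java.nio.file.Path;
import java.nio.file.PathMatcher;
import java.nio.file.Paths;

public class GlobCheck {
    public static void main(String[] args) {
        // Ant-style "**/*.xml": any directory depth, any file ending in .xml.
        PathMatcher m = FileSystems.getDefault().getPathMatcher("glob:**/*.xml");
        Path timestamped = Paths.get("target/jmeter-reports/GoogleAdvanceSearch-100906.xml");
        System.out.println(m.matches(timestamped)); // true: timestamp is irrelevant
        System.out.println(m.matches(Paths.get("report.txt"))); // false: wrong extension
    }
}
```

        Note that java.nio glob semantics are close to, but not identical with, Ant's fileset patterns; this only checks the general shape of the pattern.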

  6. Unknown User (

    Great plugin - thanks!

    Just one question: it seems the trend graphs don't appear while a build is in progress. Can anyone else confirm this?

    Our test suites tend to take a long time to run and it's kind of a pain to have them not visible during those times.

    Is there anything I can do?


    John Wood

    1. I saw the same problem, so I added the graphs to the Performance Report of each build.

      I changed PerformanceReportMap.java to this:
      package hudson.plugins.performance;

      import hudson.model.AbstractBuild;
      import hudson.model.ModelObject;
      import hudson.model.TaskListener;
      import hudson.util.ChartUtil;
      import hudson.util.ChartUtil.NumberOnlyBuildLabel;
      import hudson.util.DataSetBuilder;

      import java.io.File;
      import java.io.FileFilter;
      import java.io.FilenameFilter;
      import java.io.IOException;
      import java.io.UnsupportedEncodingException;
      import java.net.URLDecoder;
      import java.util.ArrayList;
      import java.util.Arrays;
      import java.util.Collection;
      import java.util.Collections;
      import java.util.LinkedHashMap;
      import java.util.List;
      import java.util.Map;
      import java.util.StringTokenizer;

      import org.kohsuke.stapler.StaplerRequest;
      import org.kohsuke.stapler.StaplerResponse;

      /**
       * Root object of a performance report.
       */
      public class PerformanceReportMap implements ModelObject {

          /**
           * The {@link PerformanceBuildAction} that this report belongs to.
           */
          private transient PerformanceBuildAction buildAction;

          /**
           * {@link PerformanceReport}s are keyed by {@link PerformanceReport#reportFileName}.
           * Test names are arbitrary human-readable and URL-safe strings that identify an individual report.
           */
          private Map<String, PerformanceReport> performanceReportMap = new LinkedHashMap<String, PerformanceReport>();

          private static final String PERFORMANCE_REPORTS_DIRECTORY = "performance-reports";

          /**
           * Parses the reports and builds a {@link PerformanceReportMap}.
           *
           * @throws IOException if a report fails to parse.
           */
          PerformanceReportMap(final PerformanceBuildAction buildAction, TaskListener listener)
                  throws IOException {
              this.buildAction = buildAction;
              parseReports(getBuild(), listener, new PerformanceReportCollector() {
                  public void addAll(Collection<PerformanceReport> reports) {
                      for (PerformanceReport r : reports) {
                          performanceReportMap.put(r.getReportFileName(), r);
                      }
                  }
              }, null);
          }

          private void addAll(Collection<PerformanceReport> reports) {
              for (PerformanceReport r : reports) {
                  performanceReportMap.put(r.getReportFileName(), r);
              }
          }

          public AbstractBuild<?, ?> getBuild() {
              return buildAction.getBuild();
          }

          PerformanceBuildAction getBuildAction() {
              return buildAction;
          }

          public String getDisplayName() {
              return Messages.Report_DisplayName();
          }

          public List<PerformanceReport> getPerformanceListOrdered() {
              List<PerformanceReport> listPerformance = new ArrayList<PerformanceReport>(
                      getPerformanceReportMap().values());
              Collections.sort(listPerformance);
              return listPerformance;
          }

          public Map<String, PerformanceReport> getPerformanceReportMap() {
              return performanceReportMap;
          }

          /**
           * <p>
           * Give the Performance report with the parameter for name in Bean
           * </p>
           *
           * @param performanceReportName
           * @return
           */
          public PerformanceReport getPerformanceReport(String performanceReportName) {
              return performanceReportMap.get(performanceReportName);
          }

          /**
           * Get a URI report within a Performance report file
           *
           * @param uriReport
           *            "Performance report file name";"URI name"
           * @return
           */
          public UriReport getUriReport(String uriReport) {
              if (uriReport != null) {
                  String uriReportDecoded;
                  try {
                      uriReportDecoded = URLDecoder.decode(uriReport.replace(
                              UriReport.END_PERFORMANCE_PARAMETER, ""), "UTF-8");
                  } catch (UnsupportedEncodingException e) {
                      return null;
                  }
                  // The format is "Performance report file name";"URI name" (see javadoc above).
                  StringTokenizer st = new StringTokenizer(uriReportDecoded, ";");
                  return getPerformanceReportMap().get(st.nextToken()).getUriReportMap().get(
                          st.nextToken());
              } else {
                  return null;
              }
          }

          public String getUrlName() {
              return "performanceReportList";
          }

          void setBuildAction(PerformanceBuildAction buildAction) {
              this.buildAction = buildAction;
          }

          public void setPerformanceReportMap(
                  Map<String, PerformanceReport> performanceReportMap) {
              this.performanceReportMap = performanceReportMap;
          }

          public static String getPerformanceReportFileRelativePath(
                  String parserDisplayName, String reportFileName) {
              return getRelativePath(parserDisplayName, reportFileName);
          }

          public static String getPerformanceReportDirRelativePath() {
              return getRelativePath();
          }

          private static String getRelativePath(String... suffixes) {
              StringBuilder sb = new StringBuilder(100);
              sb.append(PERFORMANCE_REPORTS_DIRECTORY);
              for (String suffix : suffixes) {
                  sb.append(File.separator).append(suffix);
              }
              return sb.toString();
          }

          /**
           * <p>
           * Verify if the PerformanceReport exists; the performanceReportName must be exactly
           * as it is in the build.
           * </p>
           *
           * @param performanceReportName
           * @return boolean
           */
          public boolean isFailed(String performanceReportName) {
              return getPerformanceReport(performanceReportName) == null;
          }

          public void doRespondingTimeGraph(StaplerRequest request,
                  StaplerResponse response) throws IOException {
              String parameter = request.getParameter("performanceReportPosition");
              AbstractBuild<?, ?> previousBuild = getBuild();
              final Map<AbstractBuild<?, ?>, Map<String, PerformanceReport>> buildReports = new LinkedHashMap<AbstractBuild<?, ?>, Map<String, PerformanceReport>>();
              while (previousBuild != null) {
                  final AbstractBuild<?, ?> currentBuild = previousBuild;
                  parseReports(currentBuild, TaskListener.NULL, new PerformanceReportCollector() {
                      public void addAll(Collection<PerformanceReport> parse) {
                          for (PerformanceReport performanceReport : parse) {
                              if (buildReports.get(currentBuild) == null) {
                                  Map<String, PerformanceReport> map = new LinkedHashMap<String, PerformanceReport>();
                                  buildReports.put(currentBuild, map);
                              }
                              buildReports.get(currentBuild).put(performanceReport.getReportFileName(), performanceReport);
                          }
                      }
                  }, parameter);
                  previousBuild = previousBuild.getPreviousBuild();
              }
              // Now we should have the data necessary to generate the graphs!
              DataSetBuilder<String, NumberOnlyBuildLabel> dataSetBuilderAverage = new DataSetBuilder<String, NumberOnlyBuildLabel>();
              for (AbstractBuild<?, ?> currentBuild : buildReports.keySet()) {
                  NumberOnlyBuildLabel label = new NumberOnlyBuildLabel(currentBuild);
                  PerformanceReport report = buildReports.get(currentBuild).get(parameter);
                  dataSetBuilderAverage.add(report.getAverage(), Messages.ProjectAction_Average(), label);
              }
              ChartUtil.generateGraph(request, response,
                      PerformanceProjectAction.createRespondingTimeChart(dataSetBuilderAverage.build()), 400, 200);
          }

          private void parseReports(AbstractBuild<?, ?> build, TaskListener listener,
                  PerformanceReportCollector collector, final String filename) throws IOException {
              File repo = new File(build.getRootDir(), PERFORMANCE_REPORTS_DIRECTORY);
              // Files directly under the directory are for JMeter, for compatibility reasons.
              File[] files = repo.listFiles(new FileFilter() {
                  public boolean accept(File f) {
                      return !f.isDirectory();
                  }
              });
              // This may fail if the build itself failed; we need to recover gracefully.
              if (files != null) {
                  addAll(new JMeterParser("").parse(build,
                          Arrays.asList(files), listener));
              }
              // Otherwise the subdirectory name designates the parser ID.
              File[] dirs = repo.listFiles(new FileFilter() {
                  public boolean accept(File f) {
                      return f.isDirectory();
                  }
              });
              // This may fail if the build itself failed; we need to recover gracefully.
              if (dirs != null) {
                  for (File dir : dirs) {
                      PerformanceReportParser p = buildAction.getParserByDisplayName(dir.getName());
                      if (p != null) {
                          File[] listFiles = dir.listFiles(new FilenameFilter() {
                              public boolean accept(File dir, String name) {
                                  if (filename == null) {
                                      return true;
                                  }
                                  return name.equals(filename);
                              }
                          });
                          collector.addAll(p.parse(build, Arrays.asList(listFiles), listener));
                      }
                  }
              }
          }

          private interface PerformanceReportCollector {
              public void addAll(Collection<PerformanceReport> parse);
          }
      }

      And then I changed the matching index.jelly to:

      <j:jelly xmlns:j="jelly:core" xmlns:st="jelly:stapler" xmlns:d="jelly:define"
      	xmlns:l="/lib/layout" xmlns:t="/lib/hudson" xmlns:f="/lib/form">
        <l:layout xmlns:jm="/hudson/plugins/performance/tags" css="/plugin/performance/css/style.css">
          <st:include it="${}" page="sidepanel.jelly" />
          <j:forEach var="performanceReport" items="${it.getPerformanceListOrdered()}">
            <h2>${%Performance Breakdown by URI}: ${performanceReport.getReportFileName()}</h2>
            <img class="trend" src="./respondingTimeGraph?width=600&amp;height=225&amp;performanceReportPosition=${performanceReport.getReportFileName()}" width="600" height="225" />
            <table class="sortable source" border="1">
              <jm:captionLine />
              <j:forEach var="uriReport" items="${performanceReport.getUriListOrdered()}">
                <tr class="${h.ifThenElse(uriReport.failed,'red','')}">
                  <td class="left">
                    <a href="./uriReport/${uriReport.encodeUriReport()}">
                      <st:out value="${uriReport.getUri()}" />
                    </a>
                  </td>
                  <jm:summaryTable it="${uriReport}" />
                </tr>
              </j:forEach>
              <tr class="bold">
                <td class="left bold">${%All URIs}</td>
                <jm:summaryTable it="${performanceReport}" />
              </tr>
            </table>
          </j:forEach>
        </l:layout>
      </j:jelly>

      The code could be a bit cleaner and better shared with the PerformanceProjectAction, which also generates graphs, but the output is fine, I think.

  7. Unknown User (

    I generate the .jtl files in my JUnit tests to measure response times in acceptance tests (using Selenium WebDriver).
    I get the Performance Report tables fine for each build in Hudson, but I don't get the trend graphs for the job. I see the names of the files, but the frames for the graphs on the performance trend page are empty. If I click the link "Trend report" I get a stack trace. The error says there is a parse error (java.lang.NumberFormatException: null) in my generated files, but not what is wrong. So my guess is that some attribute is missing that is needed to produce the trend but not needed for the tables.
    Here is an example of a file I have generated:

    <?xml version="1.0" encoding="UTF-8"?>
    <testResults version="1.2">
     lb="{By.xpath: /*}"
    <samplerData class="java.lang.String">se.dreampark.test.BasicSmokeTest</samplerData>
     lb="{WebElement {By.xpath: //*[@id='loadingIcon']} is not visible}"
    <samplerData class="java.lang.String">se.dreampark.test.BasicSmokeTest</samplerData>
     lb="{By.xpath: //*[@id='loadingIcon']}"
    <samplerData class="java.lang.String">se.dreampark.test.BasicSmokeTest</samplerData>
    The stack trace:
    java.lang.NumberFormatException: null
    	at java.lang.Long.parseLong(
    	at java.lang.Long.valueOf(
    	at hudson.plugins.performance.JMeterParser$1.startElement(
    	at javax.xml.parsers.SAXParser.parse(
    	at javax.xml.parsers.SAXParser.parse(
    	at hudson.plugins.performance.JMeterParser.parse(
    	at hudson.plugins.performance.PerformanceReportMap.(
    	at hudson.plugins.performance.PerformanceBuildAction.getPerformanceReportMap(
    	at hudson.plugins.performance.PerformanceProjectAction.getTrendReportData(
    	at hudson.plugins.performance.PerformanceProjectAction.createTrendReport(
    	at hudson.plugins.performance.PerformanceProjectAction.getDynamic(
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(
    	at java.lang.reflect.Method.invoke(
    	at org.kohsuke.stapler.Function$InstanceFunction.invoke(
    	at org.kohsuke.stapler.Function.bindAndInvoke(
    	at org.kohsuke.stapler.MetaClass$13.dispatch(
    	at org.kohsuke.stapler.Stapler.invoke(
    	at org.kohsuke.stapler.MetaClass$13.dispatch(
    	at org.kohsuke.stapler.Stapler.invoke(
    	at org.kohsuke.stapler.MetaClass$7.doDispatch(
    	at org.kohsuke.stapler.NameBasedDispatcher.dispatch(
    	at org.kohsuke.stapler.Stapler.invoke(
    	at org.kohsuke.stapler.Stapler.invoke(
    	at org.kohsuke.stapler.Stapler.service(
    	at javax.servlet.http.HttpServlet.service(
    	at winstone.ServletConfiguration.execute(
    	at winstone.RequestDispatcher.forward(
    	at winstone.RequestDispatcher.doFilter(
    	at hudson.util.PluginServletFilter$1.doFilter(
    	at org.jvnet.hudson.plugins.greenballs.GreenBallFilter.doFilter(
    	at hudson.util.PluginServletFilter$1.doFilter(
    	at hudson.util.PluginServletFilter.doFilter(
    	at winstone.FilterConfiguration.execute(
    	at winstone.RequestDispatcher.doFilter(
    	at winstone.FilterConfiguration.execute(
    	at winstone.RequestDispatcher.doFilter(
    	at org.acegisecurity.ui.ExceptionTranslationFilter.doFilter(
    	at org.acegisecurity.providers.anonymous.AnonymousProcessingFilter.doFilter(
    	at org.acegisecurity.ui.rememberme.RememberMeProcessingFilter.doFilter(
    	at org.acegisecurity.ui.AbstractProcessingFilter.doFilter(
    	at org.acegisecurity.ui.basicauth.BasicProcessingFilter.doFilter(
    	at org.acegisecurity.context.HttpSessionContextIntegrationFilter.doFilter(
    	at winstone.FilterConfiguration.execute(
    	at winstone.RequestDispatcher.doFilter(
    	at winstone.RequestDispatcher.forward(
    	at winstone.RequestHandlerThread.processRequest(
    1. Unknown User (

      Today I found the problem: some of the builds had errors in their generated .jtl files. When I removed those builds, it worked. I am still a bit puzzled, since the error above occurred even when I applied a filter to show the trend for only the latest 2 builds (i.e. not including the builds with corrupt .jtl files). That indicates the plugin reads .jtl files from all builds, even when a filter is applied.
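      The "java.lang.NumberFormatException: null" in the stack trace above is what SAX parsing yields when an expected numeric attribute is simply absent: Attributes.getValue(name) returns null for a missing attribute, and Long.valueOf(null) throws a NumberFormatException whose message is literally "null". A minimal sketch of that failure mode (the attribute name "t" and the method are hypothetical, not the plugin's actual code):

```java
// Sketch: why a sample element missing a numeric attribute surfaces as
// "java.lang.NumberFormatException: null" during trend generation.
// SAX's Attributes.getValue(name) returns null when the attribute is absent,
// and Long.valueOf((String) null) throws NumberFormatException("null").
public class MissingAttributeDemo {
    // Mimics a parser reading a (hypothetical) "t" elapsed-time attribute.
    static long readElapsed(String rawAttributeValue) {
        return Long.valueOf(rawAttributeValue); // throws if the attribute was missing
    }

    public static void main(String[] args) {
        try {
            readElapsed(null); // attribute missing from the sample element
        } catch (NumberFormatException e) {
            System.out.println("NumberFormatException: " + e.getMessage());
            // prints: NumberFormatException: null
        }
    }
}
```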

  8. Unknown User (

    Is there a way to include throughput or other variables in the performance graphs? I understand that information is included in the .jtl files, but I'm unsure of a setting that would enable throughput and other metrics and incorporate them in the Hudson performance charts. I'm also very new to the xslt format, so any advice would help immensely.


  9. What about adding a column with requests/second for JMeter scripts?

  10. Unknown User (senthil.m)

    Is there a way to add/edit graph metrics? I would like to see throughput and 90th-percentile data, and remove the min and max information.

  11. Unknown User (christian_et)

    Hi everybody,

    I also couldn't get JUnit reports to work; I always got the same results as described above.

    I had a look at the source code and found that the JUnit support does not seem to be finished yet. The way it is implemented right now, JUnit test reports must reside under <build-dir>\performance-reports\hudson.plugins.performance.JUnitParser$DescriptorImpl, but the files are not copied there automatically. I tried correcting the source code, but I still could not get it working completely.

    The easiest way to get this plugin running with JUnit reports is the following: in the source (line 62), replace JMeterParser with JUnitParser. (Note that the plugin then no longer works with JMeter.) Additionally, you have to disable one unit test, which would otherwise prevent you from building the plugin: at the bottom of the affected test class, comment out the wc.getPage() calls.

    This is just a workaround; combined support for JMeter and JUnit requires a bit more work. I'd be happy to help if I can!

  12. We've got many huge JMeter reports, and Jenkins can lock up while the Performance plugin is re-parsing them.

    Why not store the parsed results under the performance-reports folder to save this time-consuming work?

  13. Hi everyone,

    I have discovered this plugin and it serves me well. It's a great tool!

    I don't use it in a "standard" way though... I hacked up a script which measures the number of SQL queries my test suites make and exports that to a .jtl file. I can then visualize the performance of my code optimizations in terms of the number of requests made to the DB. I basically use the delay field of the jtl files to store my request count, so each test produces an entry in a jtl file, and I have 4 jtl files (one each for INSERT, DELETE, UPDATE and SELECT).

    If anyone is interested, I can share the tools I hacked together for this. I use them under Linux with a PostgreSQL database, but I am sure they could be adapted to other databases/OSs, as they aren't very complex. These are command-line tools, so there is no need to integrate anything in your code except a system call every now and then to mark the logs. I'll give more details by mail if you are interested.

    Here are a few suggestions for this plugin which would make the overall experience better:

    - I would like to be able to pick one of my JTL files as a "master" to put on the front page (when only one JTL file is parsed by the Performance plugin, it already shows up there directly)

    - it would also be nice to be able to choose which lines are drawn on the graphs. The 90% line makes less sense for me than the max, for instance. I also have no use for the "error" graph: because of the way I hacked the format, I never get any failures

    - it would be great to be able to choose the labels applied to the graphs, and the units, or at least override the defaults (especially for me, since I mentally translate milliseconds into "number of requests", but I am sure this would also be useful for i18n and other purposes)

    - during a rebuild, the performance trend cannot be seen. It would be nice if it could.

    - finally, for deployment purposes, it would be great if a performance.hpi could be provided directly instead of having to compile it by hand; that way automatic upgrades could be done from within Jenkins, etc.

    I hope I posted these suggestions in the right place,

    Thanks again for a great plugin, really useful!


  14. Hi,

    First, thanks for this very useful plugin !

    In fact, I'm facing an issue when unexpected exceptions are raised during JUnit tests; the JUnit report then looks like the following example:

    <?xml version="1.0" encoding="UTF-8"?>
    <testsuites name="my-tests" tests="9" errors="2" failures="0" ignored="0">
    <testsuite name="foo" time="0.56">
    <testcase name="myTtest" classname="bar" time="0.266">

    In such a case, I have errors="2", so I would expect to be able to use the Performance plugin to force the build to be marked as "failed", but I can't.
    Since the header contains failures="0", the build is always considered OK...

    Would it be possible to add an option to check the "errors" field as well, and to mark the build as failed if the percentage of errors is higher than a given threshold?
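
    To illustrate, the combined percentage the reporter wants checked can be computed from the <testsuites> attributes. A minimal sketch in Python (standard library only; the attribute values come from the example above):

```python
import xml.etree.ElementTree as ET

# Header taken from the JUnit report excerpt above.
report = '<testsuites name="my-tests" tests="9" errors="2" failures="0" ignored="0"></testsuites>'

root = ET.fromstring(report)
tests = int(root.get("tests"))
# Count <error> results as well as <failure> results, since an unexpected
# exception is reported under "errors", not "failures".
bad = int(root.get("errors")) + int(root.get("failures"))
print(round(100.0 * bad / tests, 1))  # 22.2
```

    If the plugin exposed this combined percentage, the usual unstable/failed thresholds could then be applied to it.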

    Many thanks,

    1. Hi Bernard, I completely agree - so I created an issue for it:

      Hopefully, this can be prioritized in the near future...though I see that the last release date was April 2012...

      1. It would be very helpful to mark the build as failed if the current average "Responding Time" (shown on the "Performance Trend" graphs) differs too much from the previous build's.

        (sorry for my English)

  15. Hi,

    Thanks for that great plugin :)

    I am using Performance plugin to display JMeter results.
    Is it possible to modify it to use Median values instead of Average values when "Performance Per Test Case Mode" is checked ?
    For example, by changing public void doRespondingTimeGraphPerTestCaseMode (?)

  16. This is a fantastic plugin and I'm rolling it out for my team to capture a performance baseline of our functional tests. I only wish there were some help for the relative threshold configuration. Neither the context help nor this page documents it for the latest version.

    What do "Unstable % Range" and "Failed % Range" mean exactly? Do "(-)" values actually mean negative percentages (i.e. a performance improvement), and why would I ever want to fail a build based on negative values? Does "-" mean minimum and "+" maximum? Can a build be failed based on test outliers (e.g. a single non-performing test), or only on the performance of the entire test suite?

    Figuring these things out is quite hard; it's basically trial and error on a 15-minute job.

  17. It took me 11 builds. A bunch of those failed on performance improvements, which makes no sense to me. In the end, I figured out how to configure thresholds to mark builds unstable if performance degrades by more than 25%:

    • Mode = Relative Threshold
    • Unstable % Range: -999 to +25
    • Failed % Range: -999 to +999
    • Compare with Build number = a baseline build number, e.g. #18 here
    • Compare based on = Median response time

    I still have a couple of things to clarify. Could you explain:

    1. What is the median response time in this case, exactly? The median of what? There is only one test time for a particular test and a particular build (#18 here), so what does "median" mean here?
    2. If there are 2 tests with degradation above the threshold, the report says the second one marked the build unstable. Why not the first one, or ideally all tests above the threshold? See the sample report from my build below. The first test exceeding the 9% threshold is "Multiple Search Criteria", but only the second one ("Network Analysis Choices") is reported.
    Performance: Percentage of relative difference outside -99.0 to +99.0 % sets the build as failure
    Performance: Percentage of relative difference outside -99.0 to +9.0 % sets the build as unstable
    Performance: Recording JUnit reports 'build/cucumber/*.xml'
    Comparison build no. - 18 and 26 using Median response time
    PrevBuildURI	CurrentBuildURI		PrevBuildURIMed		CurrentBuildURIMed	RelativeDiff	RelativeDiffPercentage
    Search by PI, case-insensitive	Search by PI, case-insensitive		44315			10324			-33991.0		-76.7
    Search by Organization, case-insensitive	Search by Organization, case-insensitive		92196			68074			-24122.0		-26.16
    Warning on all blanks in search	Warning on all blanks in search		1538			1509			-29.0		-1.89
    Search displays a message if nothing found and preserves user input	Search displays a message if nothing found and preserves user input		5097			4444			-653.0		-12.81
    Pagination of PI search results	Pagination of PI search results		18558			18705			147.0		0.79
    Select all	Select all		123646			128703			5057.0		4.09
    Multiple Search Criteria	Multiple Search Criteria		70983			83962			12979.0		18.28
    Topical Analysis Search with Co-PIs	Topical Analysis Search with Co-PIs		39866			42491			2625.0		6.58
    Sorting PI hitlist	Sorting PI hitlist		80044			79765			-279.0		-0.35
    Go to the geospatial analysis	Go to the geospatial analysis		607			608			1.0		0.16
    Geospatial Analysis Choices	Geospatial Analysis Choices		13620			13510			-110.0		-0.81
    Retain last analysis result	Retain last analysis result		3989			4060			71.0		1.78
    Clear last analysis result	Clear last analysis result		4850			4749			-101.0		-2.08
    Go to the network analysis	Go to the network analysis		600			632			32.0		5.33
    Network Analysis Choices	Network Analysis Choices		16895			18872			1977.0		11.7
    Go to the temporal analysis	Go to the temporal analysis		730			650			-80.0		-10.96
    Temporal Analysis Choices	Temporal Analysis Choices		11763			11781			18.0		0.15
    Go to the topical analysis	Go to the topical analysis		630			667			37.0		5.87
    Topical Analysis Choices	Topical Analysis Choices		37696			37631			-65.0		-0.17
    Generate visualizations and see them in history	Generate visualizations and see them in history		18414			18038			-376.0		-2.04
    Warnings for fiscal year selections	Warnings for fiscal year selections		12912			12920			8.0		0.06
    Check geocode for Geospatial Analysis	Check geocode for Geospatial Analysis		14249			14296			47.0		0.33
    "Next" button should be enabled when one or more PIs are selected	"Next" button should be enabled when one or more PIs are selected		10423			10449			26.0		0.25
    The label "Network Analysis Choices" made the build unstable
    Build step 'Publish Performance test result report' changed build result to UNSTABLE

    3. Is there ever going to be a difference between PrevBuildURI and CurrentBuildURI?
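
    The RelativeDiff and RelativeDiffPercentage columns in the log above follow from simple arithmetic against the baseline build. A quick sketch (values taken from the first two rows of the table):

```python
def relative_diff(prev_ms, curr_ms):
    """Absolute and percentage change of the current build vs. the baseline build."""
    diff = curr_ms - prev_ms
    return diff, round(100.0 * diff / prev_ms, 2)

# "Search by PI, case-insensitive": baseline 44315 ms, current 10324 ms
print(relative_diff(44315, 10324))  # (-33991, -76.7)
# "Search by Organization, case-insensitive": baseline 92196 ms, current 68074 ms
print(relative_diff(92196, 68074))  # (-24122, -26.16)
```

    With the configuration described above, a row is then flagged when its percentage falls outside the configured unstable range (-999 to +25 here).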

  18. I have a couple of issues in another job.

    1. The threshold analysis reports only the last unit test, even though 22 tests are run and JUnit reports the results just fine. Both the JUnit publisher and Performance are configured to use build/test-results/*.xml.
    Also, the Performance Trend reports on all 22 tests fine.

    Recording test results
    Performance: No threshold configured for making the test failure
    Performance: No threshold configured for making the test unstable
    Performance: Recording JUnit reports 'build/test-results/*.xml'
    Comparison build no. - 854 and 859 using Average response time
    PrevBuildURI	CurrentBuildURI		PrevBuildURIAvg		CurrentBuildURIAvg	RelativeDiff	RelativeDiffPercentage 
    testTopicalAnalysis	testTopicalAnalysis		2621			4197			1576.0		60.13
    The label "testTopicalAnalysis" made the build unstable
    Build step 'Publish Performance test result report' changed build result to UNSTABLE

    2. When threshold levels exceed a certain value (99%, I think), "No threshold configured" is reported in the console while the thresholds are actually in effect. In this example, the unstable thresholds were set to -999% to 25%, and the build was set to unstable.

  19. It appears I can't use variable substitution in the 'Report Files' field.

    I have used a template, which I copy for each new item, containing the value "test/${JOB_NAME}/target/jmeter/results/ViewResultsTree.jtl" in the 'Report Files' field. The build fails, telling me it can't find the file. It does not substitute the variable with the job name.

    Console output:

    Performance: no JMeter files matching 'test/${JOB_NAME}/target/jmeter/results/ViewResultsTree.jtl' have been found. Has the report generated?. Setting Build to FAILURE

    1. There is an existing Jira issue for this. A fix has also been submitted for review.

  20. Hi,

    It might not be the right place to ask this question:

    I have added some log.error calls for when my assertions fail in my JMeter script. When an assertion fails, the error message shows up properly in jmeter.log. How can I get these messages to show in the Jenkins console when I run JMeter with this plugin?

    In the jenkins configuration, I run jmeter in shell this way:

    sh -n -t myscript.jmx -l $WORKSPACE/reports/smoke.jtl -j $WORKSPACE/reports/smoke.log -Djmeterengine.force.system.exit=true -Jresult_dir=$WORKSPACE/reports

    In the console, it's only showing the Summariser.


    1. I found the solution:

      In, set log_file=

      This will redirect the log to the console.

      Also remove the parameter -j $WORKSPACE/reports/smoke.log.

  21. I now have a question regarding this plugin. Is it possible to restrict the response-time comparison between builds to only those builds that passed the tests functionally?

    There is no point in comparing response times if the build didn't work well functionally.


  22. Are there any plans/work going on to allow this plugin to accept .har files that come from the Chrome dev tools?
    Collecting performance metrics using Selenium/WebDriver -

  23. I'm using the JMeter Summarizer report, but the Throughput graph always shows a zero value. Does anyone else have the same issue?

  24. I have loaded 3 JMeter runs of data and the project trending report is not showing any data. There is nothing in the Throughput or Percentage of Errors graphs, and the Responding Time graph shows no data plotted, although it does show all the test case names.

    The only place any data shows is in the individual runs' performance reports, where all graphs except the Responding Time (average) graph have data. The Responding Time Average graph is empty.

    Can someone tell me what is wrong? All the data is there for each URI entry, so I would think the other reports should have data too.

  25. The change log shows 1.13, but there is no Git repo tag for it. The last Git repo tag is 1.12.

  26. Hi:

    I have cranked up the settings so that my .jtl file contains the request data, the response data -- in short, everything I need to debug a test failure. But I don't see that information surfaced anywhere within Jenkins. I click on the URI in the report, hoping to see it there, but alas. Should I be able to see this information in Jenkins, without having to resort to launching the GUI and rerunning the tests?

  27. Hi

    Since has been fixed, a new plugin release should be made. When can we expect a release?

    Kind regards,



  29. To the author of this plugin: it would be appreciated if the plugin were placed under some kind of license.

    MIT License is the most common for Jenkins plugins.

  30. All the Performance Trend Per Test Case graphs are broken for me. When I click on them, I get the following error:

    Caused by: java.lang.Error: Unresolved compilation problem: 
    	Messages cannot be resolved
    	at hudson.plugins.performance.TestSuiteReportDetail.createRespondingTimeChart(
    	at hudson.plugins.performance.TestSuiteReportDetail.doRespondingTimeGraphPerTestCaseMode(
    	at sun.reflect.GeneratedMethodAccessor436.invoke(Unknown Source)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(
    	at java.lang.reflect.Method.invoke(
    	at org.kohsuke.stapler.Function$InstanceFunction.invoke(
    	at org.kohsuke.stapler.Function.bindAndInvoke(
    	at org.kohsuke.stapler.Function.bindAndInvokeAndServeResponse(
    	at org.kohsuke.stapler.MetaClass$1.doDispatch(
    	at org.kohsuke.stapler.NameBasedDispatcher.dispatch(
    	at org.kohsuke.stapler.Stapler.tryInvoke(
    	... 71 more

    OK, I found a corresponding JIRA issue; I'll watch it and hope for a resolution.

  31. It seems that version 1.13 is still affected by this bug:

    After downgrading to version 1.10, the problem was solved.

  32. Hi,

    I get this error when using the plugin:

    ERROR: Exception while determining absolute error/unstable threshold evaluation The system cannot find the path specified
    	at Method)
    	at Source)
    	at hudson.plugins.performance.PerformancePublisher.perform(
    	at hudson.tasks.BuildStepCompatibilityLayer.perform(
    	at hudson.tasks.BuildStepMonitor$1.perform(
    	at hudson.model.AbstractBuild$AbstractBuildExecution.perform(
    	at hudson.model.AbstractBuild$AbstractBuildExecution.performAllBuildSteps(
    	at hudson.model.Build$BuildExecution.post2(
    	at hudson.model.AbstractBuild$
    	at hudson.model.Run.execute(
    	at hudson.model.ResourceController.execute(

    Can someone help me, please?

    This is my configuration :

  33. Guys, the wiki comments are not a bug tracker or a support forum; please use the appropriate channels to report issues.

  34. Hi,

    Just a quick question please.

    I have a second report (JTL file) in the same Jenkins job.

    Do you know if I can check 2 reports (at the same time) with the Performance plugin?

    Or do you know how to check (with a simple command line) the number of errors present in a report (JTL file)?

    I would like Jenkins to detect whether there is at least one error in the 2nd report (in which case the status would be set to "unstable").

    1. Someone else answered my question:

      "You can set 2 or more reports in the "Source data file" field, separated by semicolons:


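    Regarding the command-line error check: assuming the .jtl is in JMeter's default CSV format (header row with a "success" column), a rough one-liner can count the failed samples. The file contents and column position below are illustrative, not from a real run:

```shell
# Create an illustrative CSV-format .jtl (the "success" column is 5th here).
cat > report.jtl <<'EOF'
timeStamp,elapsed,label,responseCode,success
1000,120,Login,200,true
2000,340,Search,500,false
3000,90,Logout,200,true
EOF

# Count data rows whose "success" field is "false".
errors=$(awk -F, 'NR > 1 && $5 == "false" {n++} END {print n+0}' report.jtl)
echo "errors: $errors"  # errors: 1
```

    A wrapper script could then exit non-zero when the count is above zero, letting a Jenkins shell step mark the build accordingly.
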
  35. Hey,

    I have a Jenkins Pipeline (Jenkins 2.6). In my Jenkinsfile I want to run a JMeter performance test, so I added these two commands to my pipeline:

    stage('jmeter test') {
    		sh (script: "jmeter -n -l performacetest.jtl -t src/test/jmeter/performacetest.jmx", returnStdout: false)
    		performanceReport parsers: [[$class: 'JMeterParser', glob: "performacetest.jtl"]], errorFailedThreshold: 1, errorUnstableThreshold: 1, ignoreFailedBuild: false, ignoreUnstableBuild: false, relativeFailedThresholdNegative: 0, relativeFailedThresholdPositive: 0, relativeUnstableThresholdNegative: 0, relativeUnstableThresholdPositive: 0

    This works fine so far. But if the performance test fails, the pipeline build doesn't fail. In the output I can see that the status of the performance test is FAILURE:

    Performance: Percentage of errors greater or equal than 1% sets the build as unstable
    Performance: Percentage of errors greater or equal than 1% sets the build as failure
    Performance: File performacetest.jtl reported 90.909% of errors [FAILURE]. Build status is: FAILURE

    but Jenkins doesn't stop the pipeline build; it ignores the failure of the performance test and starts the next stage of the pipeline build.

    What is the correct config to exit the current pipeline build in jenkins immediately after the performance test fails?
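
    One pattern that should achieve this (a sketch, not verified against this exact plugin version): inspect currentBuild.result after the performanceReport step and abort explicitly, since publishers mark the build result rather than throwing:

```groovy
stage('jmeter test') {
    sh "jmeter -n -l performacetest.jtl -t src/test/jmeter/performacetest.jmx"
    performanceReport parsers: [[$class: 'JMeterParser', glob: 'performacetest.jtl']],
                      errorFailedThreshold: 1, errorUnstableThreshold: 1
    // performanceReport only *marks* the result; stop the pipeline explicitly.
    if (currentBuild.result == 'FAILURE' || currentBuild.result == 'UNSTABLE') {
        error("Aborting: performance test result is ${currentBuild.result}")
    }
}
```

    The error() step raises an exception, so no later stages run.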

  36. Hello,
    Just a few quick questions: on your GitHub page I do not see a license file; is there a reason for that? Shouldn't there be an MIT license, like most Jenkins plugins have? I am also trying to compile the plugin myself, but I am having some trouble with it. I keep getting the following error:

    Performance test: run [bzt, /rootWorkspace/jenkins-report.yml]
    09:09:45 INFO: Taurus CLI Tool v1.9.5
    09:09:45 INFO: Starting with configs: ['/rootWorkspace/jenkins-report.yml']
    09:09:45 INFO: Configuring...
    09:09:45 INFO: Artifacts dir: /rootWorkspace/2017-08-22_09-09-45.880817
    09:09:45 INFO: Preparing...
    09:09:45 ERROR: Config Error: No 'execution' is configured. Did you forget to pass config files?
    09:09:45 INFO: Post-processing...
    09:09:45 INFO: Artifacts dir: /rootWorkspace/2017-08-22_09-09-45.880817
    09:09:45 WARNING: Done performing with code: 1

    I used the following link to learn how to compile it: . Can anyone help me with this?

  37. I am able to generate the trend. In the performance report, what is the significance and meaning of the small numbers highlighted in red in the image below, e.g. -2, +5698, -6657?



  38. When the slave is offline, I am not able to view the JMeter report trend with this plugin.

    Is there any workaround for this?

  39. Hello,
    I see the trend graphs are plotted using cumulative values rather than per-build values. Example with error count:
    I made 2 builds:
    First build: 20 out of 20 txn passed. Error count = 0.
    Second build: 10 of 20 txn passed. The expected error count for build #2 is 50%, but Jenkins shows 33.3%.
    Is there a way to display trends based on individual builds?
    Thanks, Ashish


    1. Yes, you can see trends for each individual build. Click the Performance Report link in your build. See screenshots.

  40. Many thanks for this nice plugin!

    I'm using it for analyzing JUnit reports.

    The following two features seem very useful to me:

    • the ability to compare the performance of a given build against a specific build, not just the previous one
    • the ability to sort the performance report table by the delta in percent, i.e. to see the tests with improved/degraded performance first



  41. Can someone post a sample JMeter XML report? I have CPU usage from the server side and I want to plot it with this plugin, but I am not sure how to build the XML.
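
    For reference, a minimal .jtl in JMeter's XML format looks roughly like this (a hand-written sketch of the common attributes: t = elapsed ms, ts = timestamp, s = success, lb = label, rc = response code; the label values here are made up):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<testResults version="1.2">
  <httpSample t="123" ts="1396536807000" s="true"
              lb="cpu-usage-server-1" rc="200" rm="OK" tn="Thread Group 1-1"/>
  <httpSample t="456" ts="1396536808000" s="true"
              lb="cpu-usage-server-1" rc="200" rm="OK" tn="Thread Group 1-1"/>
</testResults>
```

    The label (lb) is what appears as the URI in the report, so a metric such as CPU usage could be logged under its own label with the value stored in t, similar to the SQL-count trick a commenter described above.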

  42. Hi, I have a "log.jtl" file with a size of 5 GB; when the plugin parses this file, there is an error:

    ... end of run
    + echo '==========end load test=========='
    ==========end load test==========
    Performance: Recording JMeter reports '**/*.jtl'
    Performance: Parsing JMeter report file '/root/.jenkins/jobs/loadtest-1-machine/builds/113/performance-reports/JMeter/log.jtl'.
    Performance: Failed to parse file '/root/.jenkins/jobs/loadtest-1-machine/builds/113/performance-reports/JMeter/log.jtl': GC overhead limit exceeded
    java.lang.OutOfMemoryError: GC overhead limit exceeded
    	at java.util.Calendar.<init>(
    	at java.util.GregorianCalendar.<init>(
    	at java.util.Calendar$
    	at sun.util.locale.provider.CalendarProviderImpl.getInstance(
    	at java.util.Calendar.createCalendar(
    	at java.util.Calendar.getInstance(
    	at java.text.SimpleDateFormat.initializeCalendar(
    	at java.text.SimpleDateFormat.<init>(
    	at java.text.SimpleDateFormat.<init>(
    	at hudson.plugins.performance.parsers.AbstractParser.initDateFormat(
    	at hudson.plugins.performance.parsers.AbstractParser.parseTimestamp(
    	at hudson.plugins.performance.parsers.JMeterCsvParser.getSample(
    	at hudson.plugins.performance.parsers.JMeterCsvParser.parseCSV(
    	at hudson.plugins.performance.parsers.JMeterCsvParser.parse(
    	at hudson.plugins.performance.parsers.JMeterParser.parseCsv(
    	at hudson.plugins.performance.parsers.JMeterParser.parse(
    	at hudson.plugins.performance.parsers.AbstractParser.parse(
    	at hudson.plugins.performance.PerformancePublisher.locatePerformanceReports(
    	at hudson.plugins.performance.PerformancePublisher.prepareEvaluation(
    	at hudson.plugins.performance.PerformancePublisher.perform(
    	at hudson.tasks.BuildStepCompatibilityLayer.perform(
    	at hudson.tasks.BuildStepMonitor$1.perform(
    	at hudson.model.AbstractBuild$AbstractBuildExecution.perform(
    	at hudson.model.AbstractBuild$AbstractBuildExecution.performAllBuildSteps(
    	at hudson.model.Build$BuildExecution.post2(
    	at hudson.model.AbstractBuild$
    	at hudson.model.Run.execute(
    	at hudson.model.ResourceController.execute(
    Performance: Percentage of errors greater or equal than 0% sets the build as unstable
    Performance: Percentage of errors greater or equal than 0% sets the build as failure