Mock HibernateTemplate with Mockito


Mockito is a useful Java library for creating mock objects in unit tests.

I used Mockito to test objects that are otherwise difficult to test, such as Struts actions, servlets, Hibernate classes, etc.

HibernateTemplate object to mock

This is the legacy code I wanted to test with Mockito. HibernateTemplate is a deprecated class, but it is still used in this code.

package com.dao.myclass;

import org.springframework.dao.DataAccessException;
import org.springframework.orm.hibernate3.support.HibernateDaoSupport;

import com.dao.ImyDAO;
import com.dao.exception.MyException;
import com.dao.myEntity;
import com.util.Logger;

public class MyHibernateImpl extends HibernateDaoSupport implements ImyDAO {

    public myEntity getMyEntity(String myNumber) throws MyException {
        try {
            if (myNumber == null) {
                Logger.error("myNumber : " + myNumber);
                return null;
            }
            if (myNumber.length() > 1) {
                myEntity thismyentity = (myEntity) getHibernateTemplate().get("com.dao.myEntity", myNumber);
                return thismyentity;
            }
            return null;
        } catch (final DataAccessException e) {
            throw new MyException("error.CodeMyEntity", e);
        }
    }
}

The line of code I found difficult to test without Mockito:

myEntity thismyentity = (myEntity) getHibernateTemplate().get("com.dao.myEntity", myNumber);

The code for myEntity :

import java.io.Serializable;

import javax.persistence.Column;
import javax.persistence.Entity;
import javax.persistence.Table;

@Entity
@Table(name = "MY_TABLE")
public class myEntity implements Serializable {

    private static final long serialVersionUID = 5447862515324364L;

    private String myNumber;

    @Column(name = "MYCOLUMN")
    private String myColumn;

    public String getmyNumber() {
        return myNumber;
    }

    public void setmyNumber(String myNumber) {
        this.myNumber = myNumber;
    }

    public String getmyColumn() {
        return myColumn;
    }

    public void setmyColumn(String myColumn) {
        this.myColumn = myColumn;
    }
}

Mock the HibernateTemplate object

I need to mock the HibernateTemplate object in order to test this method. Here is my test class with JUnit:

public class TestMyHibernateImpl {

    @Mock
    HibernateTemplate mockTemplate;

I used the annotation @Mock to create a mock object of HibernateTemplate. To use @Mock, it is necessary to initialise MockitoAnnotations first:

    @Before
    public void initMocks() {
        MockitoAnnotations.initMocks(this);
    }

This is how I test the method getMyEntity:

    @Test
    public void testCasEntityReturned() throws MyException {

        myEntity entity = new myEntity();
        String myNumber = "S54564121SD";
        String myColumnValue = "myColumn";
        entity.setmyNumber(myNumber);
        entity.setmyColumn(myColumnValue);

        // I tell Mockito to return this entity when the mock is called later in getMyEntity(..)
        Mockito.doReturn(entity).when(mockTemplate).get("com.dao.myEntity", myNumber);

        MyHibernateImpl myDao = new MyHibernateImpl();
        myDao.setHibernateTemplate(mockTemplate); // inject the mock into the class under test
        myEntity thismyentity = myDao.getMyEntity(myNumber);
        Assert.assertEquals(myNumber, thismyentity.getmyNumber());
        Assert.assertEquals(myColumnValue, thismyentity.getmyColumn());
    }


SQL*Loader fails with “ORA-01722: invalid number” on a CSV file

The problem

While importing a CSV file with SQL*Loader, I had this error:

Record 1: Rejected - Error on table MY_TABLE, column MY_COLUMN3.
ORA-01722: invalid number

Record 2: Rejected - Error on table MY_TABLE, column MY_COLUMN3.
ORA-01722: invalid number

  1 Row successfully loaded.
  11 Rows not loaded due to data errors.
  0 Rows not loaded because all WHEN clauses were failed.
  0 Rows not loaded because all fields were null.

I spent hours on this problem.

My configuration

My CTL file :


My table :

CREATE TABLE MY_TABLE (
    ID           NUMBER GENERATED BY DEFAULT AS IDENTITY,
    MY_COLUMN1   FLOAT  NOT NULL,
    MY_COLUMN2   FLOAT  NOT NULL,
    MY_COLUMN3   FLOAT  NOT NULL
) PCTFREE 10;

My CSV file :


The solution

First I checked what the error meant:

This error can mean there are blanks in the file, but after checking the CSV file there were none. However, I noticed that the last column causing problems, MY_COLUMN3, had negative values. According to the Oracle documentation, the Oracle FLOAT type can handle negative values, and the CTL file was correctly configured to handle negative values with MY_COLUMN3 DECIMAL EXTERNAL. So what was going on?
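For reference, a control file along these lines handles negative values via DECIMAL EXTERNAL (a hypothetical reconstruction for illustration, since the original CTL is not reproduced here; the path and column names follow the rest of the post):

```
LOAD DATA
INFILE '/data/myfile.csv'
APPEND INTO TABLE MY_TABLE
FIELDS TERMINATED BY ','
(
  MY_COLUMN1 DECIMAL EXTERNAL,
  MY_COLUMN2 DECIMAL EXTERNAL,
  MY_COLUMN3 DECIMAL EXTERNAL
)
```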

Finally, I discovered that after manually modifying the CSV file on the Unix server, the problem was solved! So I figured out the problem was not coming from the CTL file but from the CSV file.

After much trial and error, I found out the file was a DOS file:

vi /data/myfile.csv
"/data/myfile.csv" [noeol][dos] 13L, 832C

Actually, this CSV file came from an Excel document on Windows and was transferred to the Unix server.

So I ran the command dos2unix on the file and it fixed my issue!
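What dos2unix does is essentially strip the carriage-return character that DOS line endings ("\r\n") add before each "\n". The equivalent in plain Java would look like this (a sketch; the file handling in main is illustrative):

```java
import java.io.IOException;
import java.nio.charset.StandardCharsets;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;

public class DosToUnix {

    // Strip the "\r" that DOS line endings ("\r\n") add before each "\n".
    static String toUnix(String content) {
        return content.replace("\r\n", "\n");
    }

    public static void main(String[] args) throws IOException {
        Path csv = Paths.get(args[0]);
        String content = new String(Files.readAllBytes(csv), StandardCharsets.UTF_8);
        Files.write(csv, toUnix(content).getBytes(StandardCharsets.UTF_8));
    }
}
```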

Automate unit testing of JavaScript with Karma Runner

The problem :

Working with TDD/BDD in JavaScript is trickier than in Java. In the Java world you have JUnit to write unit tests and Maven to execute them and easily create reports in Jenkins. In the JavaScript world, the equivalents of JUnit are QUnit, Jasmine and Mocha. There are more tools because there is more complexity, and these tools do not work perfectly. Not only are there different styles of unit testing, but also different test runners. In Java you just use the one provided by default.

Initially I was using QUnit with a Maven plugin to run the tests. But it meant I could not use Jasmine 2 or Mocha. QUnit has limitations too; that's why I moved to Jasmine 2.0.

Now I use the Karma test runner to run the Jasmine unit tests, but I could also run QUnit tests if I wanted to. Karma is really flexible and can run Mocha as well, and it can include or exclude specific files from testing. The other complexity of JavaScript testing is the browser: JavaScript can run inside Chrome, Firefox or Internet Explorer. For CI testing I used a headless browser called PhantomJS.

Also, Karma has a plugin to analyse JavaScript code coverage.
It is the best generic test runner I have found so far for JavaScript unit tests.

The solution :

I will present the solution I implemented to launch JavaScript tests from a Jenkins platform. Karma is launched from Jenkins and is configured to find the Jasmine tests and to generate a JUnit-style report and a code coverage report.

Write unit tests with Jasmine 2.0

You need to write a specification (.spec) file describing the test. Example for CommonUtilSpec.js:

describe("CommonUtil", function() {

    it("trim testing", function() {
        var trimmed = trimFunc('tri');
        expect(trimmed).toBe('tri');
    });
});


The documentation for Jasmine 2 :

Install tools on the Jenkins machine

Install Node.js as in this link:

Also get Yeoman and Bower to fetch JavaScript packages.

NOTE : Bower is an equivalent of Maven for JavaScript.

Configure Karma configuration for Jenkins

I used the following link to configure Karma for Jenkins. (Note that the Karma configuration in that link has compilation errors.)

I have installed EnvInject in order to specify some environment variables for Jenkins.

Configure Karma for Jenkins. I used the information for the JUnit reporter at . It produces JUnit-style test reports, which is very practical. I also used the coverage plugin in order to know which parts of the code have been tested.

The content of my configuration file (my.jenkins.conf.js):

module.exports = function(config) {
  config.set({

    // base path that will be used to resolve all patterns (eg. files, exclude)
    basePath: '',

    // frameworks to use
    frameworks: ['jasmine'],

    // list of files / patterns to load in the browser
    files: [
    ],

    // list of files to exclude
    exclude: [
    ],

    // preprocess matching files before serving them to the browser
    preprocessors: {
      '/myproject/src/main/**/*.js': ['coverage']
    },

    // test results reporter to use
    // possible values: 'dots', 'progress'
    reporters: ['progress', 'junit', 'coverage'],

    // the junit reporter configuration
    junitReporter: {
      outputDir: '', // results will be saved as $outputDir/$browserName.xml
      outputFile: 'test_jasmine_js.xml', // if included, results will be saved as $outputDir/$browserName/$outputFile
      suite: '', // suite will become the package name attribute in xml testsuite element
      useBrowserName: true, // add browser name to report and classes names
      nameFormatter: undefined, // function (browser, result) to customize the name attribute in xml testcase element
      classNameFormatter: undefined, // function (browser, result) to customize the classname attribute in xml testcase element
      properties: {} // key value pair of properties to add to the section of the report
    },

    // web server port
    port: 9876,

    // enable / disable colors in the output (reporters and logs)
    colors: true,

    // level of logging
    // possible values: config.LOG_DISABLE || config.LOG_ERROR || config.LOG_WARN || config.LOG_INFO || config.LOG_DEBUG
    logLevel: config.LOG_INFO,

    // enable / disable watching files and executing tests whenever any file changes
    autoWatch: true,

    // start these browsers
    browsers: ['PhantomJS'],

    plugins: [
      'karma-jasmine',
      'karma-junit-reporter',
      'karma-coverage',
      'karma-phantomjs-launcher'
    ],

    // Continuous Integration mode
    // if true, Karma captures browsers, runs the tests and exits
    singleRun: true,

    // Concurrency level
    // how many browsers should be started simultaneously
    concurrency: Infinity
  });
};

Configure and launch Jenkins

I configured Jenkins to launch Karma with the shell script plugin. This solution is not optimal because the npm plugins are installed at each launch; they should be installed once.

npm install karma-jasmine --save-dev
npm install jasmine-core --save-dev
npm install karma-phantomjs-launcher --save-dev
npm install karma-junit-reporter --save-dev
npm install karma-jasmine-html-reporter --save-dev
npm install karma-html-reporter --save-dev
cp /home/myuser/myapp/my.jenkins.conf.js .
karma start my.jenkins.conf.js

Here is a picture of the configuration in Jenkins :


Once you have finished the configuration, launch the tests. The report should look like this:


The coverage plugin is really neat and shows in detail which parts of the code were not covered by the tests:




Error 1 : No provider for "framework:jasmine"! (Resolving: framework:jasmine)
Correction : npm install karma-jasmine --save-dev

Error 2 : Error: Cannot find module 'jasmine-core'
Correction : npm install jasmine-core --save-dev

Error 3 : Cannot load browser "PhantomJS": it is not registered! Perhaps you are missing some plugin?
Correction : npm install karma-phantomjs-launcher --save-dev

Error 4 : Can not load reporter "junit"
Correction : declare the reporter in the plugins section of the Karma configuration file:

plugins : [
  'karma-junit-reporter'
]

Error 5 : Cannot find plugin "karma-junit-reporter"
Correction : npm install karma-junit-reporter --save-dev

Error 6 : reporter.junit Cannot write JUnit xml
The problem was specific to where my configuration file was located. The fix was to copy the file into the working directory of the slave.

Avoid full queues with MQ Series

Introduction :

The most reliable source of information for WebSphere MQ is the IBM documentation. But there are other sources that are difficult to find with Google.

The documentation is available online for recent versions of WebSphere MQ Series; make sure to choose the right version. It is also possible to download the documentation.

Other sources of information are forums. It seems there are only a few communities out there, but they are very active:

The problem :

We have an application in production which sends messages to a WebSphere MQ output queue. A partner application reads messages from this queue and replies to the input queue of our application.

The input queue keeps receiving messages which are not consumed in production. Eventually the input queue becomes full and our application responds with a timeout.

The code uses the JMS API to send and receive messages from WebSphere MQ. We send the initial message and then wait for a reply from the partner application.

If we don't receive messages from the partner, we send an abort message to announce that we stop communicating with the partner application. Sometimes, when we send the abort request, we don't receive any message back from the partner. Later the partner sends a response to the queue, but the message is not consumed because the process has finished.

sendJMSMessage(outputQueue, request);

boolean cancelReadMessage = false;
boolean isTimeoutOccured = false;

while (!cancelReadMessage) {
    long startTime = System.currentTimeMillis();

    final Message message = receiveMessage(inputQueue, id);

    long timeOutPartner = getJmsTemplate().getReceiveTimeout() - (System.currentTimeMillis() - startTime) <= 0
            ? JmsTemplate.RECEIVE_TIMEOUT_NO_WAIT
            : getJmsTemplate().getReceiveTimeout() - (System.currentTimeMillis() - startTime);

    if (message != null) {
        // process the partner's reply
    } else {
        if (!sessionId.equals("-1") && !isTimeoutOccured) {
            final String abortRequest = PartnerUtils.buildAbortRequest(contextId, contextCode, sessionId);
            sendJMSMessage(outQueueName, abortRequest);
            isTimeoutOccured = true;
        } else {
            // NO MORE MESSAGES, STOP THE LOOP
            cancelReadMessage = true;
        }
    }
}



After analysis, several solutions were possible, but we chose only one because some were not feasible at that time:

Temporary fix: clear the queue manually.

This solution does not fix the problem for good. The next day the problem is the same and the queue needs to be cleared again manually.

How to Clear a MQ Queue from a Script or Program

Use CAPEXPRY to set an expiry date on all messages entering the queue.
This solution is only available in recent versions of MQ.

Example of command :

Ask the partner application to send abort messages with a time-to-live (TTL).

The abort message would then expire on its own after the expiry date. This solution was not possible because the partner could not modify their code at the time.

Example in JMS of how to send a message with a TTL:

// The TTL itself is configured on the sender side, e.g. with Spring's
// jmsTemplate.setTimeToLive(ttl) and setExplicitQosEnabled(true);
// the MessageCreator just builds the message.
public Message createMessage(Session session) throws JMSException {
    TextMessage message = session.createTextMessage(messageContent);
    return message;
}

Finally, the fix was to add an additional timeout when waiting for abort messages.

We found out we were not waiting long enough for the partner in the case of abort messages. We added an additional timeout to wait for the response.

sendJMSMessage(outQueueName, request);

long timeoutAbort = getTimeoutAbort(); // extra time reserved for the abort exchange
boolean cancelReadMessage = false;
boolean isTimeoutOccured = false;

while (!cancelReadMessage) {
    long startTime = System.currentTimeMillis();

    final Message message = receiveMessage(inputQueue, idSelector + " = '" + messageId + "'");

    long timeOutPartner = getJmsTemplate().getReceiveTimeout() - (System.currentTimeMillis() - startTime) - timeoutAbort <= 0
            ? JmsTemplate.RECEIVE_TIMEOUT_NO_WAIT
            : getJmsTemplate().getReceiveTimeout() - (System.currentTimeMillis() - startTime);

    if (message != null) {
        // process the partner's reply
    } else {
        if (!sessionId.equals("-1") && !isTimeoutOccured) {
            // TIMEOUT OCCURRED, SEND ABORT MESSAGE
            final String abortRequest = PartnerUtils.buildAbortRequest(contextId, contextCode, sessionId);
            sendJMSMessage(outQueueName, abortRequest);
            isTimeoutOccured = true;
        } else {
            cancelReadMessage = true;
        }
    }
}


The downside of this solution is that we now wait longer for the partner than before, but the queue does not get full anymore.
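The remaining-wait ternary used in the loops above can be extracted into a small helper for readability (a sketch; names are illustrative, and NO_WAIT stands in for Spring's JmsTemplate.RECEIVE_TIMEOUT_NO_WAIT constant):

```java
public class PartnerTimeout {

    // Sentinel meaning "poll without waiting"
    // (stands in for JmsTemplate.RECEIVE_TIMEOUT_NO_WAIT).
    static final long NO_WAIT = -1L;

    // If the configured receive timeout, minus the time already elapsed, minus the
    // margin reserved for the abort exchange, is exhausted, fall back to a no-wait
    // receive; otherwise wait for whatever time is left of the receive timeout.
    static long remaining(long receiveTimeout, long elapsed, long abortMargin) {
        return receiveTimeout - elapsed - abortMargin <= 0
                ? NO_WAIT
                : receiveTimeout - elapsed;
    }
}
```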

Clean up Jenkins Workspaces

Problem :

When you use Jenkins for continuous integration, you can quickly run into disk usage problems on slave nodes, and sometimes on the master node too. Disk usage problems happen when you have many jobs; for example, one of our jobs takes up to 2 GB. Therefore, wherever Jenkins sits, we might have disk usage problems.

Delete Workspace when build is done

For my projects I often use this option to delete the workspace after the build:

Disadvantage :

After a build, it is sometimes useful to keep the workspace in order to understand failures. You can always disable this option temporarily if there is a problem.

Clean up Slaves Workspaces

Slave workspaces are not deleted by this method, therefore I also use a script to delete them:

To execute the script on the master node, go to “Manage Jenkins” -> “Script Console”.

Comment out the line “workspacePath.deleteRecursive()” if you want to check which folders are going to be deleted.

Write a clean up job for Jenkins :

It is good practice to clean up Jenkins workspaces.

Write jobs for your maintenance tasks, such as cleanup operations to avoid full disk problems.

  • First, install the Groovy plugin. Go to “Manage Jenkins” -> “Manage Plugins”. After a reboot of Jenkins the plugin can be used.
  • Secondly, set up and configure Groovy. Go to “Manage Jenkins” -> “Configure System”, where you can configure Groovy. Then follow the steps indicated on the wiki of the Groovy plugin (previous link).

A few tips on how I installed Groovy when following the steps of the wiki:

Use the repo URL extracted from the zip file:

Choose a label for the Groovy installation; for example, Groovy.

Download Url for binary archive :

  • Then you can create your Jenkins job to clean up workspaces daily. In the job configuration, choose “Execute System Groovy script” and put the previous script from gist.github there.
  • This Jenkins job saved me a lot of time! Since then I have not had disk problems, and I no longer need to clean up the disk manually.

Clean up Master Workspace

From time to time the master node gets full, and it becomes necessary to clean up the workspace to launch jobs. To do so, go first to the console of the master node.

Then choose the option ‘Script Console’ on the master node. I simply look for the jobs which are taking up the most space.

To do so, first find the directory of JENKINS_HOME, for example with this command:

println "env".execute().text

Find out which jobs are taking the most space with:

println "du -mh [JENKINS_HOME]/jobs".execute().text

Delete the directory taking the most space (you need admin rights). Example:

println "rm -rf   [JENKINS_HOME]/jobs/[my_job_to_clean_up]/builds".execute().text

Link to the book “The Pragmatic Programmer” on Amazon

The Pragmatic Programmer: From Journeyman to Master

I recommend this book because it helped me understand the big picture of software engineering:

Other useful commands

Check where the slave workspace directories are located:

def hi = hudson.model.Hudson.instance
hi.getItems(hudson.model.Job).each { job ->
    // e.g. print each job's workspace (for AbstractProject-based jobs)
    println(job.getSomeWorkspace())
}

Delete unused jobs :

How to create a Groovy script which cleans up workspaces :

List of groovy scripts for Jenkins

Wipe out workspaces for a specific job :

Audit security of software

Introduction


I had the opportunity to analyse the security of some applications.

Basically, I decompose the analysis into three steps, taking inspiration from the OWASP references. First, I need to know what the important data of the application are. It is best to ask questions of the team who built the application. I also need to test the application and understand it, get access to the code, and look at the specification.

Then I use a tool to do a pen test. The aim is to evaluate the application by trying to exploit vulnerabilities. ZAP Proxy is a popular free tool; I can also recommend Acunetix, but you need to pay for it. I also used FindBugs with all security options ticked to detect some security flaws.

I was given a deadline to carry out the audit of an application, so the aim was not to find all the security problems of the application but the most important ones.
Over the years the top ten security problems have not changed much, so we need to look particularly at these potential problems.



I Know the application

At first we want to know if it is really worth the effort to do a full or partial audit. Read the OWASP reference as a guideline for the code review [1]. Before starting a full audit we need to know some important aspects of the application:


  1. Code: The language(s) used, and the features and issues of those languages from a security perspective; the issues one needs to look out for, and best practices from a security and performance perspective.

  2. Context: The working of the application being reviewed. All security is in the context of what we are trying to secure.

  3. Audience: The intended users of the application; is it externally facing or internal to “trusted” users? Does this application talk to other entities (machines/services)? Do humans use this application?

  4. Importance: The availability of the application is also important. Would the enterprise be affected in any great way if the application is “bounced”? [1]

Before meeting the team we need to make a checklist to get to know the application better.

The checklist should cover the most critical security controls and vulnerability areas such as:

  • Data Validation
  • Authentication
  • Session management
  • Authorization
  • Cryptography
  • Error handling
  • Logging
  • Security Configuration
  • Network Architecture

Input, for example, can be:

  • Browser input
  • Cookies
  • Property files
  • External processes
  • Data feeds
  • Service responses
  • Flat files
  • Command line parameters
  • Environment variables

Exploring the attack surface includes dynamic and static data flow analysis: where and when variables are set, how the variables are used throughout the workflow, and how attributes of objects and parameters might affect other data within the program. It determines whether the parameters, method calls, and data exchange mechanisms implement the required security.

Read the documentation about the product. Are there any security requirements? Is there any sensitive data? How can we communicate with the product (graphical interface? web services? etc.)?
Talk to the team and go through a checklist. The OWASP code review guide gives a good idea of what to ask the team. Example questions:



– What is the programming language of the application?

– What are the security requirements?

– What are the sensitive data? Do you have documentation about the architecture of the application?

– How can we access the application? Web services? Graphical interfaces?

– Is the input data being validated?

– From where can we access the application? If the application is only accessible from a private network with no internet connection, the need for security is not as high as for an application accessible by everybody on the web.

– Who are the users of the application? The general public, or users with special rights?

– What is the authentication method of the application?



In the end, talking to the team and reading the documents is not enough: the code of the application is the best source of information. When you understand the application better, you can evaluate what kind of security audit you should do.


For example, it is crucial for a banking application on the web to be very secure. However, it might not be useful to do a full audit of an application with no important data.

II Tools to scan the application

Pen test tool

In order to perform a pen test, there are a number of tools you can use. ZAP Proxy is a popular free tool. Acunetix can find more security bugs, but it is not free.


For some websites it is difficult to reach all the links automatically, yet it is important to retrieve the majority of the links of the website. Therefore it is sometimes necessary to crawl the links manually.


This link is very useful for manually crawling all links in Acunetix:

The procedure is similar with other tools on the market:


Configure the web browser

Presuming that the web browser is running on the same machine where the tool is installed, set the proxy server IP to  and the proxy server port to 8080.

  1. Start the HTTP Sniffer and browse the website using the previously configured web browser.
  2. Once ready, stop the HTTP sniffer. Save captured data by selecting ‘Save Logs’ from the Actions drop down menu.

In the Site Crawler node, click the ‘Build Structure from HTTP Sniffer log’ button (highlighted in the above screen shot) to import the captured data into the Site Crawler.

It is also possible to import HTTP Sniffer logs into an already existing scan, or import multiple HTTP Sniffer logs into the same crawl. To do so, simply tick the option “Merge the log(s) with the currently opened crawl results” in the HTTP Sniffer Log import window as highlighted below.


  3. Import the logs into the Crawler.
  4. Save the crawler import results by selecting ‘Save Results’ from the Actions drop down menu.
  5. Launch the scan.


Click on the New Scan button to launch the scan wizard.  In the first step of the Scan Wizard select the option ‘Scan using saved crawling results’ as highlighted in the above screen shot.  Proceed with completing the scan wizard to launch the automated scan against the manually browsed website.


Bug detection tool

Sonar : it can detect important security problems.

FindBugs : tick all options before scanning (by default the security options are deactivated).


III Manual testing

Read the OWASP reference as a guideline for the security verification [2].

This phase can be the longest if there are a lot of problems with the application. We need to carry out some verification requirements manually to certify the level of security of the application.

There are different levels of security, from 0 to 3 (low to high). In order to meet one of these levels, an application needs to pass some manual tests. [2]


In detail, this is the list of the security requirements:


V2. Authentication

V3. Session Management

V4. Access Control

V5. Malicious Input Handling

V7. Cryptography at Rest

V8. Error Handling and Logging

V9. Data Protection

V10. Communications


V13. Malicious Controls

V15. Business Logic

V16. File and Resource

V17. Mobile


In practice, I verify the results of the scans manually. The scans I refer to can be done with the tools previously mentioned (for example FindBugs, Sonar and ZAP Proxy). The scans will show a list of problems, and we need to make sure these security problems are real and not false positives. Concentrate on the major problems before going through the minor security breaches. Sometimes several minor security problems can lead to serious security holes.

IV Reporting


The last step of an audit is to produce a document that helps the application team fix the main security bugs. In the report I recommend putting the main security breaches. From my experience, teams do not have much time to spend fixing security issues. It is best to focus on the big security breaches and the ones that are easy to fix first.




[1]Code Review :OWASP_Code_Review_Guide-V1_1.pdf

Available to download here :



[2] Security Verification: OWASP Application Security Verification Standard (ASVS)

Available to download here :

Using JMeter with Jenkins

The problem :


A performance problem was found on a pre-production platform while running JMeter manually. This problem was found after the release was done, which postponed our release.


The solution :

Performance tests should be done daily as part of continuous integration. Thus we can discover performance problems easily and fix major problems earlier.
Assuming you already have a JMX file (JMeter file), you need to call this file from Maven with Jenkins. The second step is to parametrize variables in JMeter in order to modify some variables from Jenkins (very useful).


Call Maven from Jenkins


The configuration of the Jenkins job to call Maven from Jenkins:

Configuration Build


Some variables have been parameterized, such as NUMBER_ITERATION and NUMBER_USER. These Jenkins variables are passed to Maven variables such as number.user.jenkins.


Therefore it is possible to modify those variables when launching JMeter from Jenkins manually. The default values are used when the job is launched periodically.



I used the plugin “Publish Performance test result report” to report performance and detect regressions. There are other reporting tools in Jenkins for JMeter.


Call JMeter from Maven to pass variables


This article was very useful for mavenifying JMeter:

I chose the jmeter-maven-plugin to do it, and used the same method as in the article. I just added commons-logging as a dependency to make it work:
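For reference, the relevant part of the pom.xml looks roughly like this (a sketch from memory rather than the exact file; the version numbers are illustrative, and the number.user property mapping follows the Jenkins variables described above):

```xml
<plugin>
  <groupId>com.lazerycode.jmeter</groupId>
  <artifactId>jmeter-maven-plugin</artifactId>
  <version>1.10.1</version>
  <executions>
    <execution>
      <id>jmeter-tests</id>
      <phase>verify</phase>
      <goals>
        <goal>jmeter</goal>
      </goals>
    </execution>
  </executions>
  <configuration>
    <propertiesUser>
      <!-- passed down to the JMX file, readable there with ${__P(...)} -->
      <number.user>${number.user.jenkins}</number.user>
    </propertiesUser>
  </configuration>
  <dependencies>
    <dependency>
      <groupId>commons-logging</groupId>
      <artifactId>commons-logging</artifactId>
      <version>1.2</version>
    </dependency>
  </dependencies>
</plugin>
```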



This list of variables is passed to the JMX file:


Modify JMX file for parametrization


The final piece of work is to parametrize the actual JMX file. I found information on how to do this here:


After a few tests I managed to find the exact wording to pass variables from Maven to JMeter. An example will tell more: you can define the number of threads and loops in “Thread Group” with Number of Threads = ${__P(number.user,7)}. The default value is 7 if number.user is not passed to the JMX file.
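The __P function is essentially a property lookup with a default; in plain Java its behaviour is analogous to the following (a sketch for illustration, not JMeter's actual implementation):

```java
import java.util.Properties;

public class JMeterPropertyLookup {

    // Mimics JMeter's ${__P(name,default)}: return the property value if it
    // was passed in (e.g. from Maven/Jenkins), otherwise fall back to the default.
    static String p(Properties props, String name, String defaultValue) {
        return props.getProperty(name, defaultValue);
    }
}
```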


Also, here is the actual SOAP request with the URL and port parametrised:

As you can see, some variables are inside the request: ${input1} and ${input2}. They are actually fed from a CSV file. Tutorial:


In JMeter, the CSV file is loaded with a “CSV Data Set Config” element. Example of CSV configuration for JMeter:

Variable Names (comma-delimited) : input1,input2

Filename                         : resources/${__P(csvfile,default.csv)}

Filename is where the CSV file is located in my directory tree.

The CSV file contains the data used when calling the SOAP request. Example of a CSV file:





At the first iteration, the first line of the CSV file is used, so a SOAP request is sent with input1=data1 and input2=test1. Then the next line is used, and so on.
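The iteration behaviour can be sketched in plain Java (a simplification of what the CSV Data Set Config element does, for illustration; names are my own):

```java
import java.util.ArrayList;
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;

public class CsvDataSet {

    // Turn each CSV line into a map of variable name -> value, the way
    // "CSV Data Set Config" feeds input1/input2 to successive iterations.
    static List<Map<String, String>> rows(String[] varNames, List<String> csvLines) {
        List<Map<String, String>> rows = new ArrayList<>();
        for (String line : csvLines) {
            String[] values = line.split(",");
            Map<String, String> row = new LinkedHashMap<>();
            for (int i = 0; i < varNames.length; i++) {
                row.put(varNames[i], values[i]);
            }
            rows.add(row);
        }
        return rows;
    }
}
```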





At first I did not know why variables were not being passed to the JMX file from Jenkins. There were no errors in the logs in "target\jmeter\logs" and the JTL file was not generated. I had to look in target\jmeter\bin to check whether my variables were passed in the files.


My problem was that the names of my variables were different from the variable names in the JMX file. I resolved my issue by checking the user file.


User variables defined in Maven as <propertiesUser> are passed in this file: target\jmeter\bin\