
Wednesday, December 22, 2010

Memcached Performance Tests and Source Code (Java Source, Ant Project)

What is Memcached?

Memcached is a high-performance, distributed in-memory key-value caching system. I have prepared some test cases in order to test the performance of the Memcached distributed caching library. You can get them from my code repository with the following Mercurial command.


hg clone https://barisergunexperimentalstuff.googlecode.com/hg/ barisergunexperimentalstuff


For your information, if you don't have a Mercurial client, you can get it with your favorite package manager. I use "sudo apt-get install tortoisehg" on my Ubuntu.

A-) Local tests:

Local tests are done with a sample DataObject. When you run the tests, 8 different memcached instances are started on 8 consecutive ports.

Steps to follow

- Install memcached (e.g. sudo apt-get install memcached). My previous post also has information on installing memcached on CentOS 5.5.

- Then run the tests with the Ant script below


ant local.test -Dentry.count=10000 -Dttl=60


-Dentry.count configures the number of objects to cache. -Dttl configures the TTL (time to live) for each object in the cache.
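The TTL semantics can be sketched in plain Java: an entry becomes unreadable once its time to live has elapsed. The toy class below only illustrates that idea (it is not how memcached is implemented, and the class name is made up); an explicit clock parameter keeps it deterministic.

```java
import java.util.HashMap;
import java.util.Map;

// Toy illustration of -Dttl: each entry expires 'ttl' seconds after it
// is stored. Not memcached's implementation, just the concept.
public class TtlCacheSketch {
    private final Map<String, Object> values = new HashMap<>();
    private final Map<String, Long> expiresAtMillis = new HashMap<>();
    private final long ttlMillis;

    public TtlCacheSketch(long ttlSeconds) {
        this.ttlMillis = ttlSeconds * 1000;
    }

    public void set(String key, Object value, long nowMillis) {
        values.put(key, value);
        expiresAtMillis.put(key, nowMillis + ttlMillis);
    }

    // Returns null when the entry was never stored or has expired.
    public Object get(String key, long nowMillis) {
        Long deadline = expiresAtMillis.get(key);
        if (deadline == null || nowMillis >= deadline) {
            return null;
        }
        return values.get(key);
    }

    public static void main(String[] args) {
        TtlCacheSketch cache = new TtlCacheSketch(60); // like -Dttl=60
        cache.set("k", "v", 0);
        System.out.println(cache.get("k", 59_000)); // v    (still fresh)
        System.out.println(cache.get("k", 61_000)); // null (expired)
    }
}
```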

B-) Distributed tests:

Local tests automatically start and terminate the memcached servers on the local machine. For distributed tests you will have to start the memcached servers on remote machines and pass the server_ip:server_port combinations in the specific format shown below.

Steps to follow:

- Install memcached on remote servers

- Start memcached servers.

Eg:


memcached -d -u root -m 1024 -l 10.34.3.13 -p 1121


- Run distributed tests with the following arguments (just a sample)

ant distributed.test -Dentry.count=10000 -Dget.timeout=10 -Dttl=300 -Dservers="10.34.3.11:1121 10.34.3.12:1121 10.34.3.13:1121"

-Dget.timeout is the timeout for reading from the cache.
-Dservers is the list of memcached servers.
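For a rough intuition of what a client library does with the -Dservers list: each key is hashed to pick one server, so every client agrees on where a key lives. The sketch below uses simple modulo hashing purely for illustration; real Java memcached clients use more sophisticated (often consistent) hashing, and the class name here is made up.

```java
import java.util.Arrays;
import java.util.List;

// Illustrative only: maps a key to one of the servers from -Dservers.
public class ServerPicker {
    private final List<String> servers;

    public ServerPicker(List<String> servers) {
        this.servers = servers;
    }

    // The same key always hashes to the same server, so any client
    // holding the same server list reads where another client wrote.
    public String pick(String key) {
        int idx = Math.abs(key.hashCode() % servers.size());
        return servers.get(idx);
    }

    public static void main(String[] args) {
        ServerPicker picker = new ServerPicker(Arrays.asList(
                "10.34.3.11:1121", "10.34.3.12:1121", "10.34.3.13:1121"));
        // The same key always maps to the same server.
        System.out.println(picker.pick("user:42").equals(picker.pick("user:42")));
    }
}
```

A drawback of plain modulo hashing is that adding or removing a server remaps almost every key, which is why production clients prefer consistent hashing.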

Tips for installing memcached on CentOS 5.5

1-) Get the memcached rpm file below



wget ftp://ftp.pbone.net/mirror/ftp.pramberger.at/systems/linux/contrib/rhel5/i386/memcached-1.4.5-2.el5.pp.i386.rpm


My architecture was i386; find yours and get the appropriate package.


2-) rpm -ihv memcached-1.4.5-2.el5.pp.i386.rpm

There you go....

Tuesday, December 14, 2010

Performance of inserting data into JBoss Cache compared to ConcurrentHashMap.

You can use the Beanshell script below directly with the JBoss Cache 3.2.5 GUI demo's embedded Beanshell. Just copy the code to a file, e.g. trial.bsh, and run it from the embedded Beanshell of the JBoss Cache GUI demo with -> source("trial.bsh"). You should see some tps and performance-related text output. The script below uses ENTRY_COUNT = 120000; you can play with different entry counts. Also, do not forget to adjust the JVM heap settings (-Xmx) so you don't run out of memory for large entry counts.


import org.jboss.cache.Fqn;
import java.util.Date;
import java.io.*;
import java.util.concurrent.ConcurrentHashMap;
import java.lang.instrument.*;



public class LocationDataObject implements Externalizable {

private static final long serialVersionUID = 3383966388917008053L;

private Double xCoord;
private Double yCoord;
private Date timeStamp;

public Double getXCoord(){
return xCoord;

}
public Double getYCoord(){
return yCoord;
}
public Date getTimeStamp(){
return timeStamp;
}

public void setXCoord(Double xCoord){
this.xCoord =xCoord ;

}
public void setYCoord(Double yCoord){
this.yCoord = yCoord;
}
public void setTimeStamp(Date timeStamp){
this.timeStamp = timeStamp;
}


public void readExternal(ObjectInput in)
throws IOException,ClassNotFoundException {

xCoord = in.readDouble();
yCoord = in.readDouble();
timeStamp = (Date)in.readObject();
}


public void writeExternal(ObjectOutput out)
throws IOException {
out.writeDouble(xCoord);
out.writeDouble(yCoord);
out.writeObject(timeStamp);
}

};



public class TestJbossCachevsHasmap{
private static final Long ENTRY_COUNT = 120000L;

private static Instrumentation instrumentation;

public static void premain(String args, Instrumentation inst) {
instrumentation = inst;
}


public static void main(String [] args){

childFqn1 = Fqn.fromString("/Msisdn");

// 'cache' and 'root' are bound by the GUI demo's embedded Beanshell
tm = cache.getTransactionManager();

tm.begin();
msisdnMap = root.addChild(childFqn1);
tm.commit();

locData = new LocationDataObject();

Long startTime = System.currentTimeMillis ();

for(Long fakeMsIsdn=0; fakeMsIsdn < ENTRY_COUNT; fakeMsIsdn++){

locData.setXCoord(10.00D);
locData.setYCoord(20.00D);
Date dateNow = new Date();
locData.setTimeStamp(dateNow);

tm.begin();
msisdnMap.put(fakeMsIsdn.toString(), locData);
tm.commit();

}

Long endTime = System.currentTimeMillis ();

Long executionTime = (endTime - startTime);

double tps = (ENTRY_COUNT * 1000.0)/executionTime;

System.out.println("Duration of Jboss Cache for inserting " + ENTRY_COUNT + " entries in " + executionTime + " msec. Average tps is " +tps );

//System.out.println(ObjectSizeFetcher.getObjectSize(root));

concurrentMap = new ConcurrentHashMap ();

Long startTime2 = System.currentTimeMillis ();

for(Long fakeMsIsdn=0; fakeMsIsdn < ENTRY_COUNT; fakeMsIsdn++){


locData.setXCoord(10.00D);
locData.setYCoord(20.00D);
Date dateNow = new Date();
locData.setTimeStamp(dateNow);
concurrentMap.put(fakeMsIsdn.toString(), locData);

}

Long endTime2 = System.currentTimeMillis ();

Long executionTime2 = endTime2 - startTime2;

double tps2 = (ENTRY_COUNT * 1000.0)/executionTime2;

System.out.println("Duration of ConcurrentHashMap for inserting " + ENTRY_COUNT + " entries in " + executionTime2 + " msec. Average tps is " +tps2 );


}

public static long getObjectSize(Object o) {
return instrumentation.getObjectSize(o);
}



}


TestJbossCachevsHasmap.main(null);
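One arithmetic detail worth watching in throughput calculations like the ones in this script: with Long operands, (ENTRY_COUNT*1000)/executionTime performs integer division before the result is widened to double, silently dropping the fractional tps. A minimal standalone illustration (class and method names are made up):

```java
public class TpsMath {
    // Integer division happens first, then the truncated result is
    // widened to double: the fractional part of the tps is lost.
    static double tpsTruncated(long entries, long millis) {
        return (entries * 1000) / millis;
    }

    // Multiplying by 1000.0 forces floating-point division and keeps
    // the fractional part.
    static double tpsExact(long entries, long millis) {
        return entries * 1000.0 / millis;
    }

    public static void main(String[] args) {
        System.out.println(tpsTruncated(120000, 7001)); // 17140.0
        System.out.println(tpsExact(120000, 7001));     // ~17140.41
    }
}
```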

Tuesday, September 28, 2010

Integrating Mockito, TestNG and Emma in an Ant build (targeted at Java developers who aim for TDD)

I have prepared a very simple sample Ant project showing how to integrate and run Mockito, TestNG and Emma. For those of you who wonder what these projects are: Mockito is a Java mock framework, TestNG is a unit test framework, and Emma is a code coverage tool.

The purpose of this project is to demonstrate how ClassThatOutputsHelloMessage is tested with TestNG, how NameProviderInterface is mocked with Mockito, and how the Mockito syntax is written in a TestNG test case. It also shows dependency injection (giving NameProviderInterface as a ClassThatOutputsHelloMessage constructor argument) and what kind of code coverage report you obtain with Emma.
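The dependency-injection point can be sketched in plain Java. ClassThatOutputsHelloMessage and NameProviderInterface are the names used in the sample project, but their members below are assumptions made for this sketch, and the hand-written stub merely stands in for what a Mockito mock would provide.

```java
// The interface the class under test depends on; getName() is an
// assumed member, not necessarily what the repository declares.
interface NameProviderInterface {
    String getName();
}

class ClassThatOutputsHelloMessage {
    private final NameProviderInterface nameProvider;

    // Dependency injection: the collaborator arrives via the
    // constructor, so a test can pass a stub or a Mockito mock
    // instead of the real implementation.
    ClassThatOutputsHelloMessage(NameProviderInterface nameProvider) {
        this.nameProvider = nameProvider;
    }

    String buildMessage() {
        return "Hello " + nameProvider.getName();
    }
}

public class DiSketch {
    public static void main(String[] args) {
        // Hand-written stub playing the role a mock framework would play.
        NameProviderInterface stub = () -> "Baris";
        ClassThatOutputsHelloMessage out = new ClassThatOutputsHelloMessage(stub);
        System.out.println(out.buildMessage()); // prints "Hello Baris"
    }
}
```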

In order to see the sample, check out my experimental-stuff repository with hg (the Mercurial client):

hg clone https://barisergunexperimentalstuff.googlecode.com/hg/ barisergunexperimentalstuff

Then navigate to the testngmockitosimplesample folder and run the Ant command below:

ant runtestswithemma

If everything is okay you will see a generated folder named coverage. Inside this directory you will find the report in 3 different formats (html, xml, txt). Check the html format especially, to navigate the report easily and see which lines of code are not tested.

At the beginning you will see that not all of the code is tested. In the ClassThatOutputsHelloMessageTest.java source, uncomment the commented-out test case and re-run the tests with Emma. Check and verify that the coverage report has changed to 100% coverage.

Please place a comment here on this blog if you face any problems.

Things to remember:

Make sure your JAVA_HOME environment variable is set. When you integrate these tools into your own project, keep in mind that you should compile your sources with the debug="on" option in order to see line coverage with Emma instrumentation.
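For reference, the debug flag mentioned above goes on Ant's javac task. A minimal sketch (the target name and paths are made up; debug="on" is the essential part, so Emma can map coverage back to source lines):

```xml
<!-- Hypothetical compile target; only debug="on" matters here. -->
<target name="compile">
    <javac srcdir="src" destdir="build/classes"
           debug="on" includeantruntime="false"/>
</target>
```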


Blogs that helped me:
http://rwatsh.blogspot.com/2008/03/using-testng-with-emma-for-automating.html

Friday, September 24, 2010

Imagine how Test Driven Development helps you.

I decided to write this article to provide a general understanding of test-driven development and how it may help you. Throughout, I will refer to bugs as real-world bugs, like a fly or a mosquito, and to testing approaches as weapons used to kill them: smashing, electrical insect killers, or an axe (can you kill a mosquito with an axe? theoretically yes; yet we will see the results ;)). I will number these testing approaches from 1 to 3 and finally return to the real-world mosquito scenario.

1-) Unit Testing with Mock APIs: (SMASH)

Let's start with unit testing and the mock APIs around it. When we talk about unit testing, keep in mind there are other concepts, such as Fake APIs, as well. To give you an understanding of the difference between these 2 approaches:

a-) For a Mock API you can use mock frameworks (Mockito for Java, Google Mock for C++), and all you have to do is describe the behavior of your mocked API in your unit test cases as pre-compiled expectations; you will spend no or minimal effort (minimal for adapting API implementations) on implementing the mocked API. The mock framework does the implementation for you, or better said, it provides you an API as if a real API were interacting with the unit or module under development.

b-) The Fake API approach is much more time-consuming for a developer, because based on your real API you have to implement a "Fake API" which behaves like the real one, and you have to provide switch-like structures to manipulate the behavior of the implemented fake API.

From a development point of view, the mock approach seems much more reasonable than the fake approach. But you have to be careful about the design of your software module if you follow it: the mock approach requires you to make 3rd party APIs visible from outside your module. Dependency Injection is the remedy for such design considerations, which I will discuss in another article.
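To make the fake-with-switches idea in (b) concrete, here is a small, hypothetical Java example; the interface, class, and field names are all invented for this sketch.

```java
// Hypothetical service interface a module under test depends on.
interface PaymentGateway {
    boolean charge(String account, long amountCents);
}

// A Fake API: a hand-written implementation whose behavior is steered
// by switch-like fields set from the test, as described above.
class FakePaymentGateway implements PaymentGateway {
    boolean shouldSucceed = true;   // the "switch" the test flips
    int chargeCalls = 0;            // lets the test verify interactions

    public boolean charge(String account, long amountCents) {
        chargeCalls++;
        return shouldSucceed;
    }
}

public class FakeDemo {
    public static void main(String[] args) {
        FakePaymentGateway fake = new FakePaymentGateway();
        fake.shouldSucceed = false;          // simulate a declined payment
        System.out.println(fake.charge("acct-1", 500)); // false
        System.out.println(fake.chargeCalls);           // 1
    }
}
```

Every new behavior needs another switch like shouldSucceed, which is exactly why this approach grows tedious compared to a mock framework's per-test expectations.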


Pros:
- Isolation of development modules completely from each other during design.
- Unit tests for a module run in seconds or milliseconds.
- You can find simple bugs early in the development cycle.
- Maintenance and refactoring efforts for the software module are significantly reduced.
- The developer is more confident about his software modules.
- You do not need a test bed.
- The unit tests can be 100% automated.

Cons:
- The development effort may be doubled during the initial development (considering the reduction of maintenance effort in later cycles, this negative effect is balanced over the whole life cycle of the module).
- Another design requirement for the software module arises: "design from a test point of view".


2-) Integration Testing: (ELECTRICAL BURNER FOR KILLING INSECTS)

Now that each software module has been developed in isolation, it is time to bring them together and see their effects on each other. There are some commercially available tools, besides home-made integration test frameworks, which developers can use for this purpose. You can think of this stage as replacing the mock APIs (from the unit tests) with the real modules implemented by other developers. Generally an integrator, such as the technical leader of the project, drives this effort. I have an important suggestion here: INTEGRATE OFTEN, such as once every week or every two weeks, because otherwise it will be hard to keep up with major changes during development.

Pros:
- Gives you the ability to trace undesired effects of the integrated modules on each other, before releasing an alpha version of your product.
- Good approach for locating hard to find bugs.
- Confirms that specifications and protocols are doing fine in coordination.

Cons:
- Takes more time than unit testing, such as minutes or more depending on your environment.
- Requires you to provide a real test bed with real software and hardware units.
- Harder to automate than unit tests, because there may be many dependencies on the outside world (DBs, network conditions, etc.).


3-) Black Box Testing: (AXE)

This is actually a very large topic, including sub-headers such as Regression, Functional and Stress tests. From a developer's point of view this effort is driven by other people, tools or organizations: manual and/or automated test engineers, 3rd party testing organizations, black-box automated test tools, etc. So I will leave you the googling part and just mention the main pros and cons.

Pros:
- Overall software product is tested by professional testers in alpha and/or beta cycles before releasing the product to the market.
- In this cycle, inconsistencies are ironed out from a user point of view.

Cons:
- You really need a good test bed, test setup and automated test environment for black-box testing.
- Takes days or even months to test the overall product.

KILLING THE MOSQUITO SCENARIO

- You use the AXE to smash the mosquito; after many tries you are very tired, the walls are ruined, and the mosquito is still alive and kicking.
- Then you decide to use the ELECTRICAL BURNER. You wait for the sun to go down; once the room is dark you plug in the device and wait for the mosquito. But the mosquito finds another light source, makes its way in that direction, and never hits the electrical burner. You are very tired and sleepless; the mosquito is still alive and buzzing.
- The next morning you decide to SMASH the mosquito on the wall. You are very tired, so only after many attempts do you finally manage to get a bloody spot on the wall.

That mosquito was a NULL POINTER that somehow was never hit during your Integration or Black Box tests.


Have a nice day;

Baris ERGUN.

Sunday, September 19, 2010

Architectural Topic: Webinar about Commonality and Variability Analysis.

I have been following this pattern-oriented methodology called Commonality and Variability Analysis (CVA) for the last 4 years. I performed CVA right after Use Case Analysis, once the general use cases of the software project were clear. It is hard to find product managers who write use cases, so I also had to spend some time on Use Case Analysis, based on some valuable information from IBM Rational approaches.

The cycle goes like this: Use Case Analysis -> CVA (high-level architectural design effort) -> Test-Driven Implementation -> more detailed architectural decisions on the fly -> review of CVA before integration -> write Integration Tests -> deliver.

Thanks to Net Objectives for making this webinar about CVA available online. I hope you enjoy it.

http://www.netobjectives.com/streamzines/Commonality-VariabilityAnalysis/player.html

For further reading I suggest Coplien's thesis and Design Patterns Explained.

Tuesday, September 14, 2010

Switch to NoMachine's fast Linux-to-Linux remote desktop products (VNC or ssh -X sucks)

- Download the appropriate package for your Linux distro from this link.

- Install Client, Node and Server, respectively, on your host machine.
- Install Client on your remote machine.
- Run /usr/NX/bin/nxclient and have fun...

Note: Make sure that your sshd is up and running (prerequisite)

Tuesday, September 7, 2010

Quick guide for using Google Code Repository and Mercurial as the Source Control Tool

1-) Click to this link Code Google and choose Create a New Project Link.

2-) Following the directions from Google on that page, create your code project and choose Mercurial as your source control tool.

3-) I am using Ubuntu 10.04. I used 'sudo apt-get install tortoisehg' to install the hg executable as the Mercurial source control client.

4-) hg clone {your_google_code_repository_url}

5-) Make the necessary changes in the checked-out directory.

6-) hg add {directories or files you desire}

7-) hg commit -u {your_google_account} -m "your commit message"

Remember that commit is a local operation in Mercurial.

8-) hg push

This syncs the changes to your Google Code repository. When you run this command you will be asked for your Google Code credentials.

And that's it. Just a very fast tutorial...

Thursday, August 19, 2010

EclEmma Test Code Coverage Tool

I have been using the TestNG test framework with the Mockito mock framework in my Eclipse IDE. I strongly advise those of you using a similar set of tools for unit tests to have a look at the code coverage tool EclEmma. With EclEmma you can run your TestNG tests and see the test coverage to discover missing scenarios. It has other features as well: http://www.eclemma.org/

Wednesday, August 11, 2010

Switching from iPhone to Android: an experience.

http://designbygravity.wordpress.com/2010/08/10/iphone-to-droid-x-lots-more-good-than-bad

Thursday, May 13, 2010

Special computer glasses reduced my headaches!

Staring at my laptop or desktop screen without rest caused me a lot of headaches in my previous assignments. I have now owned a pair of special computer glasses for 6 months. The headaches are gone for sure; I heavily recommend these glasses. Gunnar Optics

Usage Example of Boost Graph adjacency_list for Combining some Actions.

The piece of code below is a quick sample I produced on the usage of Boost's Graph library (a Qt console application). The motivation is to chain Actions together on a graph and proceed to the next Action according to the currently running Action's result.



// main.cpp

#include <QtCore/QCoreApplication>
#include <QtGlobal>
#include <QString>
#include <QList>
#include <QDebug>

#include "boost/utility.hpp"
#include "boost/property_map.hpp"
#include "boost/graph/adjacency_list.hpp"


#include "Deneme1Class.h"
#include "Deneme2Class.h"
#include "Deneme3Class.h"


using namespace boost;


enum vertex_trobject_t{ vertex_trobject};
enum edge_retresult_t { edge_retresult };

namespace boost {
BOOST_INSTALL_PROPERTY(vertex, trobject);
BOOST_INSTALL_PROPERTY(edge, retresult);
}

typedef property<vertex_trobject_t, IDenemeInterface*> VertexProperty;
typedef property<edge_retresult_t, int> EdgeProperty;
typedef adjacency_list<vecS, vecS, bidirectionalS, VertexProperty, EdgeProperty> TrActionGraph;
typedef property_map<TrActionGraph, vertex_trobject_t>::type TrActionPropType;

typedef std::pair<int, int> Edge;
typedef graph_traits<TrActionGraph>::out_edge_iterator OutEdgeIterator;


// The below function is not exactly doing what I want it to do.
// create a tag for our new property
template <class EdgeIter>
void travel(EdgeIter first, EdgeIter last, const TrActionGraph& G)
{
typedef typename property_map<TrActionGraph, vertex_trobject_t>::const_type TrActionMap;
typedef typename boost::property_traits<TrActionMap>::value_type TrActionClass;
typedef typename property_map<TrActionGraph, edge_retresult_t>::const_type EdgeRetTypeMap;

// Access the propety acessor type for this graph
TrActionMap trmap = get(vertex_trobject, G);
EdgeRetTypeMap edgeMap = get(edge_retresult_t(), G);

TrActionClass src_trtype, targ_trtype;
src_trtype = boost::get(trmap, source(*first, G));
src_trtype->run();

while (first != last) {
src_trtype = boost::get(trmap, source(*first, G));
targ_trtype = boost::get(trmap, target(*first, G));
int retValProp = boost::get(edgeMap,*first);
if(src_trtype->getRetVal() == retValProp){
targ_trtype->run();
}
qDebug() << src_trtype->getTrId() << " is connected to "
<< targ_trtype->getTrId();
++first;
}

}

template <class VertexIter>
void travel2(VertexIter first, VertexIter last, const TrActionGraph& G)
{
typedef typename property_map<TrActionGraph, vertex_trobject_t>::const_type TrActionMap;
typedef typename boost::property_traits<TrActionMap>::value_type TrActionClass;
typedef typename property_map<TrActionGraph, edge_retresult_t>::const_type EdgeRetTypeMap;

typedef std::pair<OutEdgeIterator, OutEdgeIterator> OutEdgePair;

// Access the propety acessor type for this graph
TrActionMap trmap = get(vertex_trobject, G);
EdgeRetTypeMap edgeMap = get(edge_retresult_t(), G);

QList<VertexIter> listOfStartingPoints;

// Get the list of starting points(vertices) of Graph store them in QList
while (first != last) {
if(in_degree(*first,G)==0){
listOfStartingPoints.push_back(first);
}
++first;
}

TrActionClass curActionObj;

QListIterator<VertexIter> ourIter(listOfStartingPoints);
// Iterate all the starting vertexes of Graph
while(ourIter.hasNext()){
VertexIter startIter = ourIter.next();
// target() returns a vertex descriptor, not an iterator, so traverse with a descriptor
typename graph_traits<TrActionGraph>::vertex_descriptor curVertex = *startIter;
// Do not iterate further if there are no more out edges, i.e. the vertex in subject is the last vertex
while(out_degree(curVertex,G)!=0){
curActionObj = boost::get(trmap, curVertex);
curActionObj->run();
int actionResult = curActionObj->getRetVal();

// Look for the out edge whose retresult property equals the retval of the Action object
bool nextVertexFound = false;

OutEdgePair edgePair = out_edges(curVertex,G);

while(edgePair.first != edgePair.second){
int retValProp = boost::get(edgeMap,*edgePair.first);
if(retValProp == actionResult){
curVertex = target(*edgePair.first, G);
nextVertexFound = true;
break;
}
++edgePair.first;
}
Q_ASSERT(nextVertexFound);
if(!nextVertexFound){
break; // should not continue any more if no edge matches the Action's result
}

}
// Run the Action of the final vertex in the chain
curActionObj = boost::get(trmap, curVertex);
curActionObj->run();
}

}

int main(int argc, char *argv[])
{
QCoreApplication a(argc, argv);

// create a typedef for the Graph type
Deneme1Class tr1(0);
Deneme2Class tr2(1);
Deneme3Class tr3(2);
Deneme1Class tr4(3);
Deneme2Class tr5(4);
Deneme3Class tr6(5);
Deneme1Class tr7(6);
Deneme2Class tr8(7);
Deneme3Class tr9(8);
Deneme1Class tr10(9);
Deneme1Class tr11(10);
Deneme1Class tr12(11);

tr9.setNumberOfMaxRetries(1);

TrActionGraph graphOfTr(3);

Edge edge1(tr1.getTrId(),tr2.getTrId());
Edge edge2(tr1.getTrId(),tr3.getTrId());
Edge edge3(tr2.getTrId(),tr3.getTrId());
Edge edge4(tr3.getTrId(),tr4.getTrId());
Edge edge5(tr3.getTrId(),tr5.getTrId());
Edge edge6(tr4.getTrId(),tr6.getTrId());
Edge edge7(tr4.getTrId(),tr7.getTrId());
Edge edge8(tr6.getTrId(),tr8.getTrId());
Edge edge9(tr7.getTrId(),tr9.getTrId());
Edge edge10(tr7.getTrId(),tr10.getTrId());

Edge edge11(tr9.getTrId(),tr9.getTrId()); // Edge from tr9 to itself (self-loop)
Edge edge12(tr9.getTrId(),tr11.getTrId());
Edge edge13(tr9.getTrId(),tr12.getTrId());

add_edge(edge1.first, edge1.second,EdgeProperty(0), graphOfTr);
add_edge(edge2.first, edge2.second,EdgeProperty(1), graphOfTr);
add_edge(edge3.first, edge3.second,EdgeProperty(0), graphOfTr);
add_edge(edge4.first, edge4.second,EdgeProperty(0), graphOfTr);
add_edge(edge5.first, edge5.second,EdgeProperty(1), graphOfTr);
add_edge(edge6.first, edge6.second,EdgeProperty(0), graphOfTr);
add_edge(edge7.first, edge7.second,EdgeProperty(1), graphOfTr);
add_edge(edge8.first, edge8.second,EdgeProperty(0), graphOfTr);
add_edge(edge9.first, edge9.second,EdgeProperty(1), graphOfTr);
add_edge(edge10.first, edge10.second,EdgeProperty(0), graphOfTr);
add_edge(edge11.first, edge11.second,EdgeProperty(0), graphOfTr);
add_edge(edge12.first, edge12.second,EdgeProperty(2), graphOfTr);
add_edge(edge13.first, edge13.second,EdgeProperty(1), graphOfTr);

TrActionPropType trTypeMap = get(vertex_trobject_t(), graphOfTr);

boost::put(trTypeMap, tr1.getTrId(), &tr1);
boost::put(trTypeMap, tr2.getTrId(), &tr2);
boost::put(trTypeMap, tr3.getTrId(), &tr3);
boost::put(trTypeMap, tr4.getTrId(), &tr4);
boost::put(trTypeMap, tr5.getTrId(), &tr5);
boost::put(trTypeMap, tr6.getTrId(), &tr6);
boost::put(trTypeMap, tr7.getTrId(), &tr7);
boost::put(trTypeMap, tr8.getTrId(), &tr8);
boost::put(trTypeMap, tr9.getTrId(), &tr9);
boost::put(trTypeMap, tr10.getTrId(), &tr10);
boost::put(trTypeMap, tr11.getTrId(), &tr11);
boost::put(trTypeMap, tr12.getTrId(), &tr12);


//travel(edges(graphOfTr).first, edges(graphOfTr).second, graphOfTr);
travel2(vertices(graphOfTr).first, vertices(graphOfTr).second, graphOfTr);

return a.exec();
}
// end of main.cpp

// IDenemeInterface.h
#pragma once

#include <QString>

class IDenemeInterface
{
public:
IDenemeInterface();
virtual ~IDenemeInterface();

virtual void run()= 0;
virtual int getRetVal()= 0;
virtual int getTrId()= 0;
virtual QString getTrName()=0;

inline void setNumberOfMaxRetries(int retryCount){m_MaxRetryCount = retryCount;};
inline int getNumberOfMaxRetries()const{return m_MaxRetryCount;};

protected:
// Static property
int m_MaxRetryCount;

// Run Time property
int m_RetVal;
int m_CurrentRetryCount;

};

// endof IDenemeInterface.h

// IDenemeInterface.cpp

#include "IDenemeInterface.h"

IDenemeInterface::IDenemeInterface(): m_MaxRetryCount(0), m_CurrentRetryCount(0)
{
}

IDenemeInterface::~IDenemeInterface()
{
}

// end of IDenemeInterface.cpp

// Deneme1Class.h

#pragma once
#include "idenemeinterface.h"

class Deneme1Class :
public IDenemeInterface
{
public:
Deneme1Class(int trid);
virtual ~Deneme1Class(void);

void run();
int getRetVal();
int getTrId();
QString getTrName();

private:

int m_trid;
QString m_trname;
};

// endof Deneme1Class.h

// Deneme1Class.cpp
#include <QDebug>

#include "Deneme1Class.h"

Deneme1Class::Deneme1Class(int trid):m_trid(trid), m_trname("Deneme1 Tr")
{
}

Deneme1Class::~Deneme1Class(void)
{
}

void Deneme1Class::run(){
qDebug() << "Now at Deneme1Class run at obj id:" << m_trid;
m_RetVal = 1;
}

int Deneme1Class::getRetVal(){
return m_RetVal;
}

int Deneme1Class::getTrId(){
return m_trid;
}

QString Deneme1Class::getTrName(){
return m_trname;
}

// endof Deneme1Class.cpp

// Deneme2Class.h
#pragma once
#include "idenemeinterface.h"

class Deneme2Class :
public IDenemeInterface
{
public:
Deneme2Class(int trid);
virtual ~Deneme2Class(void);

void run();
int getRetVal();
int getTrId();
QString getTrName();

private:
int m_trid;
QString m_trname;
};

// endof Deneme2Class.h

// Deneme2Class.cpp
#include <QDebug>

#include "Deneme2Class.h"

Deneme2Class::Deneme2Class(int trid):m_trid(trid), m_trname("Deneme2 Tr")
{
}

Deneme2Class::~Deneme2Class(void)
{
}

void Deneme2Class::run(){
qDebug() << "Now at Deneme2Class run at obj id:" << m_trid;
m_RetVal = 0;

}

int Deneme2Class::getRetVal(){
return m_RetVal;
}


int Deneme2Class::getTrId(){
return m_trid;
}
QString Deneme2Class::getTrName(){
return m_trname;
}

// endof Deneme2Class.cpp

// Deneme3Class.h
#pragma once
#include "idenemeinterface.h"

class Deneme3Class :
public IDenemeInterface
{
public:
Deneme3Class(int trid);
virtual ~Deneme3Class(void);

void run();
int getRetVal();
int getTrId();
QString getTrName();


private:
int m_trid;
QString m_trname;
};



// endof Deneme3Class.h

// Deneme3Class.cpp

#include <QDebug>

#include "Deneme3Class.h"

Deneme3Class::Deneme3Class(int trid):m_trid(trid), m_trname("Deneme3 Tr")
{
}

Deneme3Class::~Deneme3Class()
{
}

void Deneme3Class::run(){
if(m_CurrentRetryCount++ > m_MaxRetryCount){
// no need to run anymore
m_RetVal = 1;
}else{
qDebug() << "Now at Deneme3Class run at obj id:" << m_trid;
m_RetVal = 0;
}
}

int Deneme3Class::getRetVal(){
return m_RetVal;
}

int Deneme3Class::getTrId(){
return m_trid;
}

QString Deneme3Class::getTrName(){
return m_trname;
}


// endof Deneme3Class.cpp

Tuesday, March 30, 2010

Praise for the googletest and googlemock unit test frameworks.


Rationale:

In today's software development efforts, if you are developing a long-term software project, it is essential that you give HIGH importance to the Test Driven Development (TDD) approach. I cannot think of any software engineer who can develop and maintain the modules he is responsible for without writing white-box tests such as unit and integration tests.

Software engineers who resist the TDD approach complain mainly about tight schedules, and yet they don't really know the value of re-running tests before committing changes to the modules they are responsible for.

Imagine that, on a tight schedule, you are forced to add a new feature to functionality you implemented before. How could you be confident about the new implementation without re-testing the old functionality and making sure nothing has broken?

Anyway, I will describe how googletest and googlemock eased my TDD practice. Before meeting these testing frameworks I was using cppunit and my own Fake Object implementations.

Drawbacks and struggles of cppunit

- You have to declare your test functions in a .h file, implement them in a .cpp file, and register them with the framework.

- I do not know of any mock framework that goes well with this test framework.

Drawbacks and struggles of the Fake Object approach:

- I had to implement fake classes implementing the same interfaces as the real objects.

- I had to implement function switches in order to set expectations before running my test code.

- I was skipping positive test cases, since the implementation of the fake objects was becoming too complex for me: I had to fill in all the structures required by my implementation, such as when returning a structure from within a function or as an output argument of the function. So I was not implementing my Fake Objects to handle the positive cases, leaving those to my Integration Tests.

- Because of the above, my TDD went as follows: first I wrote Integration Tests as I developed, and only at the end did I write Unit Tests for the negative cases.


Why googletest and googlemock?

After all these time-consuming approaches I was looking for a better solution. When I first reserved some time for Google's test frameworks, I observed the benefits below, which convinced me to continue developing with them.

- googletest test functions require no declaration in a header file, nor registration of those functions.

- The console output legend is easy to understand, with coloured text etc.

- The googlemock framework is simply precompilation of expectations, so I don't have to implement any Fake classes. You just declare the MOCK macros for the member functions you are going to mock. Below is an example:

class MockWifiServiceApi : public WifiServiceImplementor{
public:
MockWifiServiceApi(ServiceManipulationInterfaceSharedPtr serviceApi):WifiServiceImplementor(serviceApi){};

MOCK_METHOD3(startService, ServiceReturnCodes(const QString& serviceName,
int timeOut,
bool * isStarted));
MOCK_METHOD3(stopService, ServiceReturnCodes(const QString& serviceName,
int timeOut,
bool * isStopped));
MOCK_METHOD2(isServiceExists, ServiceReturnCodes(const QString& serviceName,
bool * exists));
MOCK_METHOD2(isServiceRunning, ServiceReturnCodes(const QString& serviceName,
bool * running));
MOCK_METHOD1(locateService, ServiceReturnCodes(QString * serviceName));


};

- They also provide Python scripts to generate the above header declarations automatically, which is very nice.

- You can mock output arguments of a function in a handy way, and the framework provides enough tricks for difficult mocking problems, such as arguments without a copy constructor or assignment operator.

- Finally, it is very easy to learn and has a very good online wiki.

Conclusion:

Google's test frameworks increased my productivity and helped me greatly in enhancing the quality of my code. Thanks to the Google team for their significant contribution to open source development.







Monday, March 29, 2010

Hello world.