Repository: NCIC-PARALLEL/Graphine-SDK
Branch: master
Commit: 89a5a562228d
Files: 40
Total size: 91.9 KB
Directory structure:
gitextract_44_kfi19/
├── .gitignore
├── PAK.Rproj
├── README.md
├── analysis_module/
│ ├── appinfo/
│ │ ├── analysis.sh
│ │ ├── appinfo.R
│ │ └── featureinfo.xml
│ ├── envinfo/
│ │ ├── EnvGather.R
│ │ ├── analysis.sh
│ │ └── featureinfo.xml
│ ├── paktimer/
│ │ ├── Makefile
│ │ ├── analysis.sh
│ │ ├── featureinfo.xml
│ │ ├── paktimer
│ │ └── paktimer.c
│ └── tau/
│ ├── analysis.sh
│ ├── featureinfo.xml
│ ├── featureofPAPI.r
│ └── outputformat.R
├── applications/
│ ├── multiplyexample.c
│ ├── optimized.cpp
│ └── result.xml
├── framework/
│ ├── DBModule/
│ │ └── functions.R
│ ├── EvaluatorModule/
│ │ └── Evaluator.R
│ ├── ExtractorModule/
│ │ ├── Analyze.old
│ │ └── Extractor.R
│ ├── Interface/
│ │ ├── Analyser.R
│ │ └── Generator.R
│ ├── LearnerModule/
│ │ └── Learner.R
│ ├── OptimizerModule/
│ │ └── Optimizer.R
│ ├── ProducerModule/
│ │ ├── .RData
│ │ └── Producer.R
│ ├── Tuning/
│ │ └── Tuner.R
│ ├── dependencies.R
│ └── lib/
│ ├── OptimizationSpace.R
│ ├── learners.R
│ └── producers.R
├── generator_module/
│ └── optimizeCompilerFlag/
│ ├── transform.sh
│ └── variantinfo.xml
├── pak.R
└── tutorial/
└── autotuning_compilerflag.R
================================================
FILE CONTENTS
================================================
================================================
FILE: .gitignore
================================================
*.svn*
.Rproj.user
.Rhistory
================================================
FILE: PAK.Rproj
================================================
Version: 1.0
RestoreWorkspace: Default
SaveWorkspace: Default
AlwaysSaveHistory: Default
EnableCodeIndexing: Yes
UseSpacesForTab: Yes
NumSpacesForTab: 2
Encoding: UTF-8
RnwWeave: Sweave
LaTeX: pdfLaTeX
================================================
FILE: README.md
================================================
# PAK
A performance tuning and knowledge management suite
# Introduction
PAK is a general autotuning framework for scientific applications that significantly reduces the programmer's workload and speeds up code optimisation.
We believe optimising code should be an enjoyable, creative experience. PAK takes the pain out of optimisation by modelling the different roles of an optimisation project, such as the feature extractor model and the optimiser model.
PAK is accessible yet powerful, providing the tools needed for large, robust applications.
# PAK models

## Analyser
### Functional description of the analyser
Analyses the features of an application instance and its performance indices.
### Customizing
First define the configuration files, featureinfo.xml and analysis.sh.
#### featureinfo.xml
##### Feature definition
A feature definition includes: the static/dynamic type, the feature name, the feature description, the enabling environment variable (when the variable is TRUE, the corresponding feature is analysed), and the data type.
Feature data types include numerical, category, boolen and combination (nesting is supported).
##### Example
```
<feature>
  <type>static</type>
  <name>arrayshape</name>
  <description>the shape of array in target program</description>
  <enable_variable>Enable_arrayshape</enable_variable>
  <datatype>numerical</datatype>
  <datatype>numerical</datatype>
  <datatype>numerical</datatype>
</feature>
```
#### analysis.sh
- input: the target application, environment variables.
- output: a result file; the script prints the result file name.
- example of the output format:
```
<features>
  <feature>
    <name>arrayshape</name>
    <value>128</value>
    <value>128</value>
    <value>256</value>
  </feature>
</features>
```
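For illustration, a minimal analysis.sh in this spirit might look like the sketch below; the feature name `linecount`, the demo fallback input, and the exact XML layout are illustrative assumptions, not part of PAK:

```shell
#!/bin/bash
# sketch of a minimal analysis.sh; "linecount" is a made-up feature
target=${1:-target.txt}
# create a demo input when no target application is supplied
[ -f "$target" ] || printf 'line1\nline2\n' > "$target"
value=$(wc -l < "$target" | tr -d ' ')
# write the result file in the name/value layout the framework parses
printf '<features><feature><name>linecount</name><value>%s</value></feature></features>\n' "$value" > result.xml
# the framework reads the result file name from stdout
echo "result.xml"
```

A real analyser would additionally check its enabling environment variable before running, as the bundled paktimer module does.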
### Initializing an object
`C.Analyser(Name,Path,Features)`
- Name: the name of the analyser (the file name of the configuration file)
- Path: the path of the configuration file
- Features: the features to analyse
## Generator
### Functional description of the generator
Based on the input parameters, optimises and transforms the application instance.
### Customizing
First define the configuration files, variantinfo.xml and transform.sh.
#### variantinfo.xml
##### Variant parameter definition
A definition includes: the name of the variant parameter, its description, the enabling environment variable (which carries the parameter value to the transform), and the data type (the same data types as for features).
##### Example
```
<parameter>
  <name>Unrolling</name>
  <description>the unrolling factor</description>
  <enable_variable>ENABLE_Unrolling</enable_variable>
  <datatype>numerical</datatype>
</parameter>
```
#### transform.sh
- input: the name of the target application, the name of the output file, environment variables.
- output: the optimised instance; the script prints the output file name.
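As a sketch (the file names, the default factor, and the stub transformation are illustrative; a real transform.sh would rewrite the code):

```shell
#!/bin/bash
# sketch of a transform.sh; the parameter value arrives via ENABLE_Unrolling
src=${1:-demo.c}
out=${2:-optimized.cpp}
# create a demo source when no application is supplied
[ -f "$src" ] || printf 'int main(){return 0;}\n' > "$src"
factor=${ENABLE_Unrolling:-4}
# a real generator would unroll loops here; this stub only records the variant
cp "$src" "$out"
printf '/* variant: unrolling factor %s */\n' "$factor" >> "$out"
# the framework reads the output file name from stdout
echo "$out"
```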
### Initializing an object
`C.Generator(Name,Path,Parameters)`
- Name: the name of the generator (the file name of the configuration file)
- Path: the path of the configuration file
- Parameters: the received parameters
## Extractor
### Functional description of the extractor
The instance analyser, in charge of the static analysis, environment analysis and input analysis of an instance. Each extractor analyses the instance by combining one or more analyser objects. The analysis results can drive parameter production, prediction and knowledge mining.
### Customizing
Instantiate a customised object with the input parameters.
### Initializing an object
`C.Extractor$new(analysers)`
- analysers: a list object specifying the feature analysers.
### Example
```
# create a list that maps analyser hpsFrontend to the feature flop_intensity
analyser.hpsFrontend<-list(hpsFrontend=c("flop_intensity"))
# init an extractor object using the previous list
myextractor<-C.Extractor$new(list(hpsFrontend=analyser.hpsFrontend))
```
## Producer
### Functional description of the producer
The base class of producers, defining the interface for instantiating a producer. A producer drives the parameter-production process: using the instance analysis results and the previous evaluation, it can implement various algorithms, including heuristic search, exhaustive search and model prediction.
### Customizing
Customise a producer by implementing the interface method `getParameter`.
### Base class: `C.Producer()`
`getParameter(step,extractor.result,score)`:
- step: the current iteration step
- extractor.result: the analysed features of the running instance
- score: the score of the previous parameter set
### Example
```
# an exhaustive-search producer that walks through parameter.space row by row
C.Producer.Exhaustion<-setRefClass(
  "C.Producer.Exhaustion",
  contains="C.Producer",
  fields = list(parameter.space="data.frame"),
  methods = list(
    # init function
    initialize=function(parameter.space=data.frame()){
      parameter.space<<-parameter.space
    },
    # implement the interface method
    getParameter=function(step,extractor.result,score)
    {
      # an empty data.frame signals that the space is exhausted
      if(step>nrow(parameter.space))
        return (data.frame())
      # return the step-th candidate parameter set
      return (parameter.space[step,,drop=FALSE])
    }
  )
)
```
## Optimizer
### Functional description of the optimiser
Optimises the instance, covering code transformation, generation of optimised variants and environment setup. Currently every optimiser contains a single generator for producing optimised variants.
### Customizing
Instantiate a customised object with the input parameters.
### Initializing an object
`C.Optimizer(generator.name,output.name)`
- generator.name: the generator of the optimised variant
- output.name: the output file of the optimised variant; default: "optimized.cpp"
### Example
```
# init an optimizer using hpsGen, a code generator for stencils
myoptimizer<-C.Optimizer$new(generator.name="hpsGen")
```
## Evaluator
### Functional description of the evaluator
Evaluates the running instance during autotuning, obtaining performance indices, score indices and so on. Every evaluator includes one or more analyser objects for measuring the indices of the optimised variant. Each evaluated index is associated with one evaluation function: when the index meets the requirement the function returns 0, otherwise it returns a negative number whose absolute value is the distance to the requirement. The evaluator finally returns a total score, the sum over all evaluation indices; when the total score is 0, the autotuning has converged.
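This scoring convention can be sketched with plain R functions; the 100-second threshold, the `eval.miss` metric and the sample values here are illustrative, not part of PAK:

```r
# an evaluation function returns 0 when the requirement is met, otherwise a
# negative number whose magnitude is the distance to the requirement
eval.time <- function(x) if (x > 100) 100 - x else 0            # wall-clock target
eval.miss <- function(x) if (x > 0.5) -10 * (x - 0.5) else 0    # hypothetical miss-rate target
scores <- c(eval.time(120), eval.miss(0.4))
total  <- sum(scores)   # -20: not yet convergent; a total of 0 stops the tuning
```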
### Customizing
Instantiate a customised object with the input parameters.
### Initializing an object
`C.Evaluator(sub.evaluators)`
- sub.evaluators: a list of sub-evaluators; each index is the name of an evaluator and each value is a list of evaluation functions.
### Example
```
# create a sub-evaluator: a list mapping features to evaluation functions
sub.evaluator.tau<-list(P_WALL_CLOCK_TIME=function(x){if(x>100) return (100-x) else return(0)})
# init a C.Evaluator object
myevaluator<-C.Evaluator$new(sub.evaluators=list(tau=sub.evaluator.tau))
```
## Example: implementing a full tuning run
```
# create a tuner
mytuner<-C.Tuner$new(app=app,optimizer=myoptimizer,evaluator=myevaluator,producer =myproducer,need.store=TRUE)
# perform tuning
mytuner$tune()
# output best parameters
print(mytuner$best.parameters)
```
================================================
FILE: analysis_module/appinfo/analysis.sh
================================================
#!/bin/bash
source /home/lyl/.bashrc
Rscript $(cd "$(dirname "$0")"; pwd)/appinfo.R $1
================================================
FILE: analysis_module/appinfo/appinfo.R
================================================
library("XML")
bashrc<-"/home/lyl/.bashrc"
analysername<-"appinfo"
appinfo.list<-list()
args<-commandArgs(T)
appname<-args[1]
appinfo.list$MD5<-gsub(" .*","",system(paste0("md5sum ",appname),intern = TRUE))
doc = newXMLDoc()
fsnode<-newXMLNode(name="features",doc=doc)
for(i in 1:length(appinfo.list))
{
fnode<-newXMLNode(name = "feature",parent = fsnode)
addChildren(fnode,
newXMLNode(name="name",names(appinfo.list[i])),
newXMLNode(name="value",appinfo.list[[i]])
)
}
rfilename<-paste0(Sys.time(),analysername,"anaylsisresult.xml")
rfilename<-sub(":","",rfilename)
rfilename<-sub("-","",rfilename)
rfilename<-sub(" ","",rfilename)
output<-saveXML(doc,file=rfilename,prefix = sprintf(" ",analysername))
cat(output)
================================================
FILE: analysis_module/appinfo/featureinfo.xml
================================================
<features>
  <feature>
    <name>MD5</name>
    <datatype>category</datatype>
    <type>static</type>
    <description>calculates and verifies 128-bit MD5 hashes, as described in RFC 1321</description>
    <avail>TRUE</avail>
    <enable_variable>ENABLE_MD5</enable_variable>
  </feature>
</features>
================================================
FILE: analysis_module/envinfo/EnvGather.R
================================================
library("XML")
bashrc<-"/home/lyl/.bashrc"
analysername<-"EnvGather"
envlist<-list()
cpuinfo<-system("lscpu",intern = TRUE)
meminfo<-system("cat /proc/meminfo",intern = TRUE)
info<-c(cpuinfo,meminfo)
for(i in info)
{
tmp<-unlist(strsplit(i,": *"))
if(tmp[1]=="Architecture")
envlist[tmp[1]]<-as.character(tmp[2])
if(tmp[1]=="CPUs")
envlist[tmp[1]]<-as.numeric(tmp[2])
if(tmp[1]=="CPU MHz")
envlist[sub(" ","_",tmp[1])]<-as.numeric(tmp[2])
if(tmp[1]=="Threads per core")
envlist[sub(" ","_",tmp[1])]<-as.numeric(tmp[2])
if(tmp[1]=="Cores per socket")
envlist[sub(" ","_",tmp[1])]<-as.numeric(tmp[2])
if(tmp[1]=="Byte Order")
envlist[sub(" ","_",tmp[1])]<-as.character(tmp[2])
if(tmp[1]=="Sockets")
envlist[sub(" ","_",tmp[1])]<-as.numeric(tmp[2])
if(tmp[1]=="NUMA nodes")
envlist[sub(" ","_",tmp[1])]<-as.numeric(tmp[2])
if(tmp[1]=="L1d cache")
envlist["L1d_cache_K"]<-as.numeric(sub("K","",tmp[2]))
if(tmp[1]=="L1i cache")
envlist["L1i_cache_K"]<-as.numeric(sub("K","",tmp[2]))
if(tmp[1]=="L2 cache")
envlist["L2_cache_K"]<-as.numeric(sub("K","",tmp[2]))
if(tmp[1]=="L3 cache")
envlist["L3_cache_K"]<-as.numeric(sub("K","",tmp[2]))
if(tmp[1]=="MemTotal")
envlist["MemTotal_K"]<-as.numeric(sub("kB","",tmp[2]))
}
envlist["OS_version"]<-system("head -n 1 /etc/issue",intern = TRUE)
envlist["gcc_version"]<-system(sprintf("source %s; gcc -dumpversion;",bashrc),intern = TRUE)
envlist["icc_version"]<-system(sprintf("source %s; icc -dumpversion;",bashrc),intern = TRUE)
envlist["nvcc_version"]<-gsub(".*release *|,.*","",system(sprintf("source %s; nvcc --version | grep release",bashrc),intern = TRUE))
doc = newXMLDoc()
fsnode<-newXMLNode(name="features",doc=doc)
for(i in 1:length(envlist))
{
fnode<-newXMLNode(name = "feature",parent = fsnode)
addChildren(fnode,
newXMLNode(name="name",names(envlist[i])),
newXMLNode(name="value",envlist[[i]])
)
}
rfilename<-paste0(Sys.time(),analysername,"anaylsisresult.xml")
rfilename<-sub(":","",rfilename)
rfilename<-sub("-","",rfilename)
rfilename<-sub(" ","",rfilename)
output<-saveXML(doc,file=rfilename,prefix = sprintf(" ",analysername))
cat(output)
#
# featureinfo_doc = newXMLDoc()
# fsnode<-newXMLNode(name="features",doc=featureinfo_doc)
# for(i in 1:length(envlist))
# {
# fnode<-newXMLNode(name = "feature",parent = fsnode)
# if(is.numeric(envlist[[i]]))
# type<-"numerical"
# else
# type<-"category"
#
# addChildren(fnode,
# newXMLNode(name="name",names(envlist[i])),
# newXMLNode(name="datatype",type),
# newXMLNode(name="type","static"),
# newXMLNode(name="description","As name shows"),
# newXMLNode(name="avail","TRUE"),
# newXMLNode(name="enable_variable",paste0("ENABLE_",names(envlist[i])))
# )
#
#
# }
#
# output<-saveXML(featureinfo_doc,file="featureinfo.xml",prefix = sprintf(" ",analysername))
================================================
FILE: analysis_module/envinfo/analysis.sh
================================================
#!/bin/bash
source /home/lyl/.bashrc
Rscript $(cd "$(dirname "$0")"; pwd)/EnvGather.R
================================================
FILE: analysis_module/envinfo/featureinfo.xml
================================================
<features>
  <feature><name>Architecture</name><datatype>category</datatype><type>static</type><description>As name shows</description><avail>TRUE</avail><enable_variable>ENABLE_Architecture</enable_variable></feature>
  <feature><name>Byte_Order</name><datatype>category</datatype><type>static</type><description>As name shows</description><avail>TRUE</avail><enable_variable>ENABLE_Byte_Order</enable_variable></feature>
  <feature><name>CPU_MHz</name><datatype>numerical</datatype><type>static</type><description>As name shows</description><avail>TRUE</avail><enable_variable>ENABLE_CPU_MHz</enable_variable></feature>
  <feature><name>L1d_cache_K</name><datatype>numerical</datatype><type>static</type><description>As name shows</description><avail>TRUE</avail><enable_variable>ENABLE_L1d_cache_K</enable_variable></feature>
  <feature><name>L1i_cache_K</name><datatype>numerical</datatype><type>static</type><description>As name shows</description><avail>TRUE</avail><enable_variable>ENABLE_L1i_cache_K</enable_variable></feature>
  <feature><name>L2_cache_K</name><datatype>numerical</datatype><type>static</type><description>As name shows</description><avail>TRUE</avail><enable_variable>ENABLE_L2_cache_K</enable_variable></feature>
  <feature><name>L3_cache_K</name><datatype>numerical</datatype><type>static</type><description>As name shows</description><avail>TRUE</avail><enable_variable>ENABLE_L3_cache_K</enable_variable></feature>
  <feature><name>MemTotal_K</name><datatype>numerical</datatype><type>static</type><description>As name shows</description><avail>TRUE</avail><enable_variable>ENABLE_MemTotal_K</enable_variable></feature>
  <feature><name>OS_version</name><datatype>category</datatype><type>static</type><description>As name shows</description><avail>TRUE</avail><enable_variable>ENABLE_OS_version</enable_variable></feature>
  <feature><name>gcc_version</name><datatype>category</datatype><type>static</type><description>As name shows</description><avail>TRUE</avail><enable_variable>ENABLE_gcc_version</enable_variable></feature>
  <feature><name>icc_version</name><datatype>category</datatype><type>static</type><description>As name shows</description><avail>TRUE</avail><enable_variable>ENABLE_icc_version</enable_variable></feature>
  <feature><name>nvcc_version</name><datatype>category</datatype><type>static</type><description>As name shows</description><avail>TRUE</avail><enable_variable>ENABLE_nvcc_version</enable_variable></feature>
</features>
================================================
FILE: analysis_module/paktimer/Makefile
================================================
CXX=gcc
INCLUDES=-I.
CXXFLAGS=$(INCLUDES)
default:
$(CXX) paktimer.c -o paktimer
clean::
-rm -f paktimer
================================================
FILE: analysis_module/paktimer/analysis.sh
================================================
#!/bin/bash
#export ENABLE_PAKTIME=TRUE
TIMEENABLE=`env |grep ENABLE_PAKTIME | grep TRUE`
if [ "$TIMEENABLE" = "ENABLE_PAKTIME=TRUE" ];then
$(cd "$(dirname "$0")"; pwd)/paktimer "$1" 1>/dev/null 2>/dev/null
time=`cat temp.time`
rm -f temp.time result.xml
echo "<features><feature><name>time</name><value>$time</value></feature></features>">>result.xml
echo "result.xml"
fi
================================================
FILE: analysis_module/paktimer/featureinfo.xml
================================================
<features>
  <feature>
    <type>dynamic</type>
    <name>time</name>
    <description>the execution time of a given application</description>
    <enable_variable>ENABLE_PAKTIME</enable_variable>
    <datatype>numerical</datatype>
  </feature>
</features>
================================================
FILE: analysis_module/paktimer/paktimer.c
================================================
#include <stdio.h>
#include <stdlib.h>
#include <sys/time.h>
struct timeval t1;
struct timeval t2;
int main(int argc,char** argv)
{
FILE *fp;
double time;
//printf("the target file is %s \n", argv[1]);
gettimeofday(&t1,0);
system(argv[1]);
gettimeofday(&t2,0);
time = ((((1000000.0 * (t2.tv_sec - t1.tv_sec)) + t2.tv_usec) - t1.tv_usec) / 1000000.0);
if((fp=fopen("temp.time","wb"))){
fprintf(fp,"%.4f",time);
fclose(fp);
}
return 0;
}
================================================
FILE: analysis_module/tau/analysis.sh
================================================
#!/bin/bash
source /home/lyl/.bashrc
export PATH=/home/lyl/tools/tau2.23_icpc_pdt_papi/x86_64/bin:$PATH
export TAU_MAKEFILE=/home/lyl/tools/tau2.23_icpc_pdt_papi/x86_64/lib/Makefile.tau-icpc-papi-pdt
export TAU_THROTTLE=0
i=1
for f in `env |grep ENABLE_ | grep TRUE`
do
FNAME=${f#ENABLE_}
FNAME=${FNAME%%=*}
export COUNTER${i}=$FNAME
let i+=1
done
icpc_flag=$icpc_flag
icpc_flag=$icpc_flag
icpc_flag=$icpc_flag
icpc_flag=$icpc_flag
icpc_flag=$icpc_flag
#CC=icc
CC=tau_cxx.sh
rm MULTI__P* -rf
$CC $icpc_flag -c -vec-report2 $1 -o mid.o 2>/dev/null 1>/dev/null
$CC $icpc_flag mid.o main.cpp -o myexe 2>/dev/null 1>/dev/null
./myexe 2>/dev/null 1>/dev/null
Rscript $(cd "$(dirname "$0")"; pwd)/outputformat.R
rm *.o
rm myexe
================================================
FILE: analysis_module/tau/featureinfo.xml
================================================
P_WALL_CLOCK_TIME Time cost of target application
dynamic TRUE ENABLE_P_WALL_CLOCK_TIME numerical
PAPI_REF_CYC Reference clock cycles
dynamic TRUE ENABLE_PAPI_REF_CYC numerical
PAPI_VEC_DP Double precision vector/SIMD instructions
dynamic TRUE ENABLE_PAPI_VEC_DP numerical
PAPI_VEC_SP Single precision vector/SIMD instructions dynamic TRUE ENABLE_PAPI_VEC_SP numerical PAPI_DP_OPS Floating point operations; optimized to count scaled double precision vector operations dynamic TRUE ENABLE_PAPI_DP_OPS numerical PAPI_SP_OPS Floating point operations; optimized to count scaled single precision vector operations dynamic TRUE ENABLE_PAPI_SP_OPS numerical PAPI_FP_OPS Floating point operations dynamic TRUE ENABLE_PAPI_FP_OPS numerical PAPI_FNV_INS Floating point inverse instructions dynamic FALSE ENABLE_PAPI_FNV_INS numerical PAPI_FSQ_INS Floating point square root instructions dynamic FALSE ENABLE_PAPI_FSQ_INS numerical PAPI_FDV_INS Floating point divide instructions dynamic TRUE ENABLE_PAPI_FDV_INS numerical PAPI_FAD_INS Floating point add instructions dynamic FALSE ENABLE_PAPI_FAD_INS numerical PAPI_FML_INS Floating point multiply instructions dynamic FALSE ENABLE_PAPI_FML_INS numerical PAPI_L3_TCW Level 3 total cache writes dynamic TRUE ENABLE_PAPI_L3_TCW numerical PAPI_L2_TCW Level 2 total cache writes dynamic TRUE ENABLE_PAPI_L2_TCW numerical PAPI_L1_TCW Level 1 total cache writes dynamic FALSE ENABLE_PAPI_L1_TCW numerical PAPI_L3_TCR Level 3 total cache reads dynamic TRUE ENABLE_PAPI_L3_TCR numerical PAPI_L2_TCR Level 2 total cache reads dynamic TRUE ENABLE_PAPI_L2_TCR numerical PAPI_L1_TCR Level 1 total cache reads dynamic FALSE ENABLE_PAPI_L1_TCR numerical PAPI_L3_TCA Level 3 total cache accesses dynamic TRUE ENABLE_PAPI_L3_TCA numerical PAPI_L2_TCA Level 2 total cache accesses dynamic TRUE ENABLE_PAPI_L2_TCA numerical PAPI_L1_TCA Level 1 total cache accesses dynamic FALSE ENABLE_PAPI_L1_TCA numerical PAPI_L3_TCH Level 3 total cache hits dynamic FALSE ENABLE_PAPI_L3_TCH numerical PAPI_L2_TCH Level 2 total cache hits dynamic FALSE ENABLE_PAPI_L2_TCH numerical PAPI_L1_TCH Level 1 total cache hits dynamic FALSE ENABLE_PAPI_L1_TCH numerical PAPI_L3_ICW Level 3 instruction cache writes dynamic FALSE ENABLE_PAPI_L3_ICW numerical PAPI_L2_ICW 
Level 2 instruction cache writes dynamic FALSE ENABLE_PAPI_L2_ICW numerical PAPI_L1_ICW Level 1 instruction cache writes dynamic FALSE ENABLE_PAPI_L1_ICW numerical PAPI_L3_ICR Level 3 instruction cache reads dynamic TRUE ENABLE_PAPI_L3_ICR numerical PAPI_L2_ICR Level 2 instruction cache reads dynamic TRUE ENABLE_PAPI_L2_ICR numerical PAPI_L1_ICR Level 1 instruction cache reads dynamic FALSE ENABLE_PAPI_L1_ICR numerical PAPI_L3_ICA Level 3 instruction cache accesses dynamic TRUE ENABLE_PAPI_L3_ICA numerical PAPI_L2_ICA Level 2 instruction cache accesses dynamic TRUE ENABLE_PAPI_L2_ICA numerical PAPI_L1_ICA Level 1 instruction cache accesses dynamic FALSE ENABLE_PAPI_L1_ICA numerical PAPI_L3_ICH Level 3 instruction cache hits dynamic FALSE ENABLE_PAPI_L3_ICH numerical PAPI_L2_ICH Level 2 instruction cache hits dynamic TRUE ENABLE_PAPI_L2_ICH numerical PAPI_L1_ICH Level 1 instruction cache hits dynamic FALSE ENABLE_PAPI_L1_ICH numerical PAPI_L3_DCW Level 3 data cache writes dynamic TRUE ENABLE_PAPI_L3_DCW numerical PAPI_L2_DCW Level 2 data cache writes dynamic TRUE ENABLE_PAPI_L2_DCW numerical PAPI_L1_DCW Level 1 data cache writes dynamic FALSE ENABLE_PAPI_L1_DCW numerical PAPI_L3_DCR Level 3 data cache reads dynamic TRUE ENABLE_PAPI_L3_DCR numerical PAPI_L2_DCR Level 2 data cache reads dynamic TRUE ENABLE_PAPI_L2_DCR numerical PAPI_L1_DCR Level 1 data cache reads dynamic FALSE ENABLE_PAPI_L1_DCR numerical PAPI_L3_DCA Level 3 data cache accesses dynamic TRUE ENABLE_PAPI_L3_DCA numerical PAPI_L2_DCA Level 2 data cache accesses dynamic TRUE ENABLE_PAPI_L2_DCA numerical PAPI_L1_DCA Level 1 data cache accesses dynamic FALSE ENABLE_PAPI_L1_DCA numerical PAPI_L2_DCH Level 2 data cache hits dynamic TRUE ENABLE_PAPI_L2_DCH numerical PAPI_L1_DCH Level 1 data cache hits dynamic FALSE ENABLE_PAPI_L1_DCH numerical PAPI_SYC_INS Synchronization instructions completed dynamic FALSE ENABLE_PAPI_SYC_INS numerical PAPI_LST_INS Load/store instructions completed dynamic FALSE 
ENABLE_PAPI_LST_INS numerical PAPI_TOT_CYC Total cycles dynamic TRUE ENABLE_PAPI_TOT_CYC numerical PAPI_FP_STAL Cycles the FP unit(s) are stalled dynamic FALSE ENABLE_PAPI_FP_STAL numerical PAPI_RES_STL Cycles stalled on any resource dynamic FALSE ENABLE_PAPI_RES_STL numerical PAPI_VEC_INS Vector/SIMD instructions (could include integer) dynamic FALSE ENABLE_PAPI_VEC_INS numerical PAPI_BR_INS Branch instructions dynamic TRUE ENABLE_PAPI_BR_INS numerical PAPI_SR_INS Store instructions dynamic TRUE ENABLE_PAPI_SR_INS numerical PAPI_LD_INS Load instructions dynamic TRUE ENABLE_PAPI_LD_INS numerical PAPI_FP_INS Floating point instructions dynamic TRUE ENABLE_PAPI_FP_INS numerical PAPI_INT_INS Integer instructions dynamic FALSE ENABLE_PAPI_INT_INS numerical PAPI_TOT_INS Instructions completed dynamic TRUE ENABLE_PAPI_TOT_INS numerical PAPI_TOT_IIS Instructions issued dynamic FALSE ENABLE_PAPI_TOT_IIS numerical PAPI_FMA_INS FMA instructions completed dynamic FALSE ENABLE_PAPI_FMA_INS numerical PAPI_BR_PRC Conditional branch instructions correctly predicted dynamic TRUE ENABLE_PAPI_BR_PRC numerical PAPI_BR_MSP Conditional branch instructions mispredicted dynamic TRUE ENABLE_PAPI_BR_MSP numerical PAPI_BR_NTK Conditional branch instructions not taken dynamic TRUE ENABLE_PAPI_BR_NTK numerical PAPI_BR_TKN Conditional branch instructions taken dynamic TRUE ENABLE_PAPI_BR_TKN numerical PAPI_BR_CN Conditional branch instructions dynamic TRUE ENABLE_PAPI_BR_CN numerical PAPI_BR_UCN Unconditional branch instructions dynamic TRUE ENABLE_PAPI_BR_UCN numerical PAPI_HW_INT Hardware interrupts dynamic FALSE ENABLE_PAPI_HW_INT numerical PAPI_FUL_CCY Cycles with maximum instructions completed dynamic FALSE ENABLE_PAPI_FUL_CCY numerical PAPI_STL_CCY Cycles with no instructions completed dynamic FALSE ENABLE_PAPI_STL_CCY numerical PAPI_FUL_ICY Cycles with maximum instruction issue dynamic FALSE ENABLE_PAPI_FUL_ICY numerical PAPI_STL_ICY Cycles with no instruction issue dynamic TRUE 
ENABLE_PAPI_STL_ICY numerical PAPI_MEM_WCY Cycles Stalled Waiting for memory writes dynamic FALSE ENABLE_PAPI_MEM_WCY numerical PAPI_MEM_RCY Cycles Stalled Waiting for memory Reads dynamic FALSE ENABLE_PAPI_MEM_RCY numerical PAPI_MEM_SCY Cycles Stalled Waiting for memory accesses dynamic FALSE ENABLE_PAPI_MEM_SCY numerical PAPI_CSR_TOT Total store conditional instructions dynamic FALSE ENABLE_PAPI_CSR_TOT numerical PAPI_CSR_SUC Successful store conditional instructions dynamic FALSE ENABLE_PAPI_CSR_SUC numerical PAPI_CSR_FAL Failed store conditional instructions dynamic FALSE ENABLE_PAPI_CSR_FAL numerical PAPI_TLB_SD Translation lookaside buffer shootdowns dynamic FALSE ENABLE_PAPI_TLB_SD numerical PAPI_L3_DCH Level 3 data cache hits dynamic FALSE ENABLE_PAPI_L3_DCH numerical PAPI_PRF_DM Data prefetch cache misses dynamic FALSE ENABLE_PAPI_PRF_DM numerical PAPI_BTAC_M Branch target address cache misses dynamic FALSE ENABLE_PAPI_BTAC_M numerical PAPI_L2_STM Level 2 store misses dynamic TRUE ENABLE_PAPI_L2_STM numerical PAPI_L2_LDM Level 2 load misses dynamic FALSE ENABLE_PAPI_L2_LDM numerical PAPI_L1_STM Level 1 store misses dynamic TRUE ENABLE_PAPI_L1_STM numerical PAPI_L1_LDM Level 1 load misses dynamic TRUE ENABLE_PAPI_L1_LDM numerical PAPI_TLB_TL Total translation lookaside buffer misses dynamic FALSE ENABLE_PAPI_TLB_TL numerical PAPI_TLB_IM Instruction translation lookaside buffer misses dynamic TRUE ENABLE_PAPI_TLB_IM numerical PAPI_TLB_DM Data translation lookaside buffer misses dynamic TRUE ENABLE_PAPI_TLB_DM numerical PAPI_LSU_IDL Cycles load/store units are idle dynamic FALSE ENABLE_PAPI_LSU_IDL numerical PAPI_FPU_IDL Cycles floating point units are idle dynamic FALSE ENABLE_PAPI_FPU_IDL numerical PAPI_FXU_IDL Cycles integer units are idle dynamic FALSE ENABLE_PAPI_FXU_IDL numerical PAPI_BRU_IDL Cycles branch units are idle dynamic FALSE ENABLE_PAPI_BRU_IDL numerical PAPI_L3_STM Level 3 store misses dynamic FALSE ENABLE_PAPI_L3_STM numerical PAPI_L3_LDM 
Level 3 load misses dynamic FALSE ENABLE_PAPI_L3_LDM numerical PAPI_CA_ITV Requests for cache line intervention dynamic FALSE ENABLE_PAPI_CA_ITV numerical PAPI_CA_INV Requests for cache line invalidation dynamic FALSE ENABLE_PAPI_CA_INV numerical PAPI_CA_CLN Requests for exclusive access to clean cache line dynamic FALSE ENABLE_PAPI_CA_CLN numerical PAPI_CA_SHR Requests for exclusive access to shared cache line dynamic FALSE ENABLE_PAPI_CA_SHR numerical PAPI_CA_SNP Requests for a snoop dynamic FALSE ENABLE_PAPI_CA_SNP numerical PAPI_L3_TCM Level 3 cache misses dynamic TRUE ENABLE_PAPI_L3_TCM numerical PAPI_L2_TCM Level 2 cache misses dynamic TRUE ENABLE_PAPI_L2_TCM numerical PAPI_L1_TCM Level 1 cache misses dynamic TRUE ENABLE_PAPI_L1_TCM numerical PAPI_L3_ICM Level 3 instruction cache misses dynamic FALSE ENABLE_PAPI_L3_ICM numerical PAPI_L3_DCM Level 3 data cache misses dynamic FALSE ENABLE_PAPI_L3_DCM numerical PAPI_L2_ICM Level 2 instruction cache misses dynamic TRUE ENABLE_PAPI_L2_ICM numerical PAPI_L2_DCM Level 2 data cache misses dynamic TRUE ENABLE_PAPI_L2_DCM numerical PAPI_L1_ICM Level 1 instruction cache misses dynamic TRUE ENABLE_PAPI_L1_ICM numerical PAPI_L1_DCM Level 1 data cache misses dynamic TRUE ENABLE_PAPI_L1_DCM numerical
================================================
FILE: analysis_module/tau/featureofPAPI.r
================================================
library(XML)
papi_str<-system("source ~/.bashrc; papi_avail",intern=TRUE)
str(papi_str)
bg<-FALSE
for(oneline in papi_str)
{
tmp<-unlist(strsplit(oneline,"[ ][ ][ ]*"))
if(length(tmp)>2 && tmp[2]=="Name")
{
featuredata<<-data.frame(name=character(),avail=logical(),description=character(),enable_variable=character(),stringsAsFactors=FALSE)
bg<-TRUE
}else if(bg){
if(length(tmp)<2)
break;
part1<-unlist(strsplit(oneline,"0x"))
namestr<-part1[1]
part2<-unlist(strsplit(part1[2],"[ ][ ][ ]*"))
Avail<-part2[2]
if(Avail=="Yes")
Avail<- TRUE
else
Avail<-FALSE
featuredata<<-rbind(data.frame(name=namestr,description=part2[4],avail=Avail,enable_variable=sprintf("ENABLE_%s",namestr),stringsAsFactors=FALSE),featuredata)
}
}
featuredata$type<-"dynamic"
featuredata$datatype<-"numerical"
doc = newXMLDoc()
fsnode<-newXMLNode(name="features",doc=doc)
for(i in 1:nrow(featuredata))
{
feature<-featuredata[i,]
fnode<-newXMLNode(name = "feature",parent = fsnode)
addChildren(fnode,
newXMLNode(name="name",feature$name),
newXMLNode(name="description",feature$description),
newXMLNode(name="type",feature$type),
newXMLNode(name="avail",feature$avail),
newXMLNode(name="enable_variable",feature$enable_variable),
newXMLNode(name="datatype",feature$datatype)
)
}
saveXML(doc,file="./featureinfo.xml",prefix = " ")
================================================
FILE: analysis_module/tau/outputformat.R
================================================
library("XML")
analysername<-"tau"
metrics.vec<- system("env |grep ENABLE_ | grep TRUE",intern = TRUE)
metrics.list<-list()
for(m in metrics.vec)
{
m.name<-gsub("ENABLE_|=TRUE","",m)
if(length(metrics.vec)>1)
r<-system(paste0("tail MULTI__",m.name,"/profile.0.0.0 |grep main;"),intern = TRUE)
else
r<-system(paste0("tail profile.0.0.0 |grep main;"),intern = TRUE)
r<-unlist(strsplit(r,'"'))[3]
r<-unlist(strsplit(r,' '))[5]
metrics.list[m.name]<-r
}
doc = newXMLDoc()
fsnode<-newXMLNode(name="features",doc=doc)
for(i in 1:length(metrics.list))
{
fnode<-newXMLNode(name = "feature",parent = fsnode)
addChildren(fnode,
newXMLNode(name="name",names(metrics.list[i])),
newXMLNode(name="value",metrics.list[[i]])
)
}
rfilename<-paste0(Sys.time(),analysername,"anaylsisresult.xml")
rfilename<-sub(":","",rfilename)
rfilename<-sub("-","",rfilename)
rfilename<-sub(" ","",rfilename)
output<-saveXML(doc,file=rfilename,prefix = sprintf(" \n",analysername))
cat(output)
================================================
FILE: applications/multiplyexample.c
================================================
#include <stdio.h>
#include <stdlib.h>
#include <time.h>
#define DATATYPE float
#define NX 3000
#define NY 3000
#define ITER 30
DATATYPE a[NX][NY];
DATATYPE b[NX][NY];
DATATYPE c[NX][NY];
int main()
{
int i,j,t;
for(t=0;t<ITER;t++)
for(i=0;i<NX;i++)
for(j=0;j<NY;j++)
c[i][j]=a[i][j]*b[i][j];
return 0;
}
================================================
FILE: applications/result.xml
================================================
<features><feature><name>time</name><value>1.7945</value></feature></features>
================================================
FILE: framework/DBModule/functions.R
================================================
OpenDB <- function() {
# Open a SQL conn if the global.conn has not been initialized,
# and save the conn to global.conn, or just return global.conn
#
# Returns:
# The connection opened or retrieved from the global environment
if(!exists("global.conn"))
global.conn <<- odbcConnect(datasource,database.user,database.pwd)
conn <- global.conn
return(conn)
}
CloseDB <- function() {
# Close the global connection opened by OpenDB, if any
if(exists("global.conn")) {
close(global.conn)
rm(global.conn,envir = .GlobalEnv)
}
}
CheckTableExistence<-function(dbname,tbname){
# check if a table 'tbname' exists in database 'dbname'
#
# Returns:
# TRUE if exist, or FALSE if not exist
cmd.str <- sprintf('show tables in %s like "%s";',
dbname, tbname)
conn <- OpenDB()
result <- sqlQuery(conn,cmd.str)
if(!is.data.frame(result))
stop(paste0("error when executing SQL command in CheckTableExistence: ",result))
if(nrow(result)==0)
return (FALSE)
else
return (TRUE)
}
CreateTable<-function(format,dbname,tbname){
# create a table 'tbname' in database 'dbname'. The structure of the table
# is specified by the dataframe 'format'
#
# Returns:
# TRUE if success, or FALSE if fail
cmd.str <- sprintf('show tables in %s like "%s";',
dbname, tbname)
conn <- OpenDB()
result <- sqlQuery(conn,cmd.str)
if(!is.data.frame(result))
stop(paste0("error when executing SQL command in CreateTable: ",result))
if(nrow(result)==1)
{
print(sprintf("table %s already exists in database %s",tbname,dbname))
return (FALSE)
}
FormatTable<-function(format)
{
fmt.str<-"id int(10) primary key not null auto_increment,"
for(i in 1:nrow(format))
{
name<-format[i,]$name
datatype<-format[i,]$datatype
if(datatype=="numerical")
fmt.str<-paste0(fmt.str,name," DOUBLE")
if(datatype=="category")
fmt.str<-paste0(fmt.str,name," VARCHAR(255)")
if(datatype=="boolen")
fmt.str<-paste0(fmt.str,name," TINYINT")
if(i!=nrow(format))
fmt.str<-paste0(fmt.str,",")
}
return (fmt.str)
}
cmd.str<-sprintf("create table %s.%s (%s);",dbname,tbname,FormatTable(format))
result <- sqlQuery(conn,cmd.str)
return(TRUE)
}
CheckAndUpdateMainTableCol<-function(subtable.names,dbname="hpts"){
# check if the main table contains the columns that connect to the subtables.
# If not, alter the main table.
# Args:
# subtable.names: the names of the subtables that will connect to the main table
# dbname: the name of the database
#
cmd.str<-sprintf("select COLUMN_NAME from information_schema.COLUMNS where table_name = 'main' and table_schema = '%s';",dbname)
conn<-OpenDB()
result<-sqlQuery(conn,cmd.str)
maintable.names<-as.character(result[[1]])
notin.names<-subtable.names[!(subtable.names %in% maintable.names)]
if(length(notin.names)>0)
for(i in 1:length(notin.names))
{
result<-sqlQuery(conn,sprintf("alter table %s.main add %s int(10);",dbname,notin.names[i]))
}
}
CheckAndUpdateTableStructure<-function(data.names,dbname="hpts",tbname="main"){
# check if the table 'tbname' contains the columns named in data.names.
# If not, alter the table.
# Args:
# data.names: the names of data that will be inserted to the table
# dbname: the name of database
# tbname: the name of table
cmd.str<-sprintf("select COLUMN_NAME from information_schema.COLUMNS where table_name = '%s' and table_schema = '%s';",tbname,dbname)
conn<-OpenDB()
result<-sqlQuery(conn,cmd.str)
table.names<-as.character(result[[1]])
not.in.names<-data.names[!(data.names %in% table.names)]
if(length(not.in.names)>0)
for(i in 1:length(not.in.names))
{
result<-sqlQuery(conn,sprintf("alter table %s.%s add %s int(10);",dbname,tbname,not.in.names[i]))
}
}
StoreAnalysis<-function(analysis.results,override=TRUE,
analysis_module.path=path.analysis_tools)
{
# store the analysis results to the DB.
# Args:
# analysis.results: a list containing the analysis results of multiple analysers
# override: whether to overwrite an existing record with the same key analysis results
# analysis_module.path: the directory path of the analysers
#
# Returns:
# the id in main table if success, or 0 if fail
key.analyser.names<-c("appinfo","envinfo")
for(ka in key.analyser.names)
{
if(length(analysis.results[[ka]])==0)
stop(sprintf("result of key analyser '%s' cannot be NULL!",ka))
}
analyser.names<-names(analysis.results)
nonkey.analyser.names<-analyser.names[!analyser.names %in% key.analyser.names]
keytable.id<-data.frame(name=character(),value=integer(),stringsAsFactors = FALSE)
keytable.id.format<-data.frame(name=character(),datatype=character(),stringsAsFactors = FALSE)
nonkeytable.id<-data.frame(name=character(),value=integer(),stringsAsFactors = FALSE)
nonkeytable.id.format<-data.frame(name=character(),datatype=character(),stringsAsFactors = FALSE)
# key
for(analyser in key.analyser.names)
{
result<-analysis.results[[analyser]]
format<-SerializeXmlDoc(paste0(analysis_module.path,analyser,"/featureinfo.xml"),"datatype")
if(CheckTableExistence("hpts",analyser)==FALSE)
CreateTable(format,"hpts",analyser)
if(override)
{
df<-SelectFromDB(result,format,"hpts",analyser)
if(nrow(df)>0)
sub.id<-df[1,]$id
else
sub.id<-InsertToDB(result,format,"hpts",analyser)
}
else
sub.id<-InsertToDB(result,format,"hpts",analyser)
subtable.id<-data.frame(name=analyser,value=sub.id,stringsAsFactors = FALSE)
subtable.id.format<-data.frame(name=analyser,datatype="numerical",stringsAsFactors = FALSE)
keytable.id<-rbind(keytable.id,subtable.id)
keytable.id.format<-rbind(keytable.id.format,subtable.id.format)
}
# non key
for(analyser in nonkey.analyser.names)
{
result<-analysis.results[[analyser]]
format<-SerializeXmlDoc(paste0(analysis_module.path,analyser,"/featureinfo.xml"),"datatype")
if(CheckTableExistence("hpts",analyser)==FALSE)
CreateTable(format,"hpts",analyser)
sub.id<-InsertToDB(result,format,"hpts",analyser)
subtable.id<-data.frame(name=analyser,value=sub.id,stringsAsFactors = FALSE)
subtable.id.format<-data.frame(name=analyser,datatype="numerical",stringsAsFactors = FALSE)
nonkeytable.id<-rbind(nonkeytable.id,subtable.id)
nonkeytable.id.format<-rbind(nonkeytable.id.format,subtable.id.format)
}
if(CheckTableExistence("hpts","main")==FALSE)
{
print("main table does not exist, please create a main table!")
return(0)
}else{
subtable<-rbind(keytable.id,nonkeytable.id)
subtable.format<-rbind(keytable.id.format,nonkeytable.id.format)
#check whether the main table has a column for every subtable; if not, alter it in the database
CheckAndUpdateMainTableCol(subtable.format$name,"hpts")
if(override)
{
target.row<-SelectFromDB(keytable.id,keytable.id.format,"hpts","main")
if(nrow(target.row)>0)
UpdateForDB(keytable.id,keytable.id.format,nonkeytable.id,nonkeytable.id.format,"hpts","main")
else
InsertToDB(subtable,subtable.format,"hpts","main")
newdata<-SelectFromDB(keytable.id,keytable.id.format,"hpts","main")
mid<-newdata[1,]$id
}
else
{
mid<-InsertToDB(subtable,subtable.format,"hpts","main")
}
return (mid)
}
}
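# Example usage (a sketch; the feature names and values below are illustrative
# and must match the analysers' featureinfo.xml files):
#   results <- list(
#     appinfo=data.frame(name="appname",value="multiplyexample",stringsAsFactors=FALSE),
#     envinfo=data.frame(name="hostname",value="node01",stringsAsFactors=FALSE))
#   main.id <- StoreAnalysis(results,override=TRUE)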
StoreTransformation<-function(main.id, generator.results,analysis.results,override=TRUE,
generator_module.path=path.generator_tools,
analysis_module.path=path.analysis_tools)
{
# store the main table id, generator parameters, and analysis results to the DB.
# Args:
# generator.results: a list containing the data of one generator; the list size must be 1
# analysis.results: a list containing the analysis results of multiple analysers
# override: whether to overwrite a record that has the same generator parameters and main.id
# generator_module.path: the directory path of the generators
# analysis_module.path: the directory path of the analysers
#
# Returns:
# the id of the record in the generator's table
#check if the generator.result size =1
if(length(generator.results)!=1)
stop(sprintf("wrong length of generator.results in StoreTransformation: should be 1 but was %d",length(generator.results)))
# format analysis.results to subtable
subtable<-data.frame(name=character(),value=character(),stringsAsFactors = FALSE)
subtable.format<-data.frame(name=character(),datatype=character(),stringsAsFactors = FALSE)
for(i in 1:length(analysis.results))
{
analyser<-names(analysis.results[i])
result<-analysis.results[[i]]
format<-SerializeXmlDoc(paste0(analysis_module.path,analyser,"/featureinfo.xml"),"datatype")
if(CheckTableExistence("hpts",analyser)==FALSE)
CreateTable(format,"hpts",analyser)
sub.id<-InsertToDB(result,format,"hpts",analyser)
subtable<-rbind(subtable,data.frame(name=analyser,value=as.character(sub.id),stringsAsFactors = FALSE))
subtable.format<-rbind(subtable.format,data.frame(name=analyser,datatype="numerical",stringsAsFactors = FALSE))
}
#combine generator.parameter and subtable
generator.name<-names(generator.results[1])
generator.parameters<-generator.results[[1]]
generator.format<-SerializeXmlDoc(paste0(generator_module.path,generator.name,"/variantinfo.xml"),"datatype")
generator.parameters<-rbind(generator.parameters,data.frame(name="instanceId",value=as.character(main.id),stringsAsFactors = FALSE))
generator.format<-rbind(generator.format,data.frame(name="instanceId",datatype="numerical",stringsAsFactors = FALSE))
cond.parameters<-generator.parameters
cond.format<-generator.format
generator.parameters<-rbind(generator.parameters,subtable)
generator.format<-rbind(generator.format,subtable.format)
if(CheckTableExistence("hpts",generator.name)==FALSE)
CreateTable(generator.format,"hpts",generator.name)
else
CheckAndUpdateTableStructure(generator.format$name,"hpts",generator.name)
if(override)
{
target.row<-SelectFromDB(cond.parameters,cond.format,"hpts",generator.name)
if(nrow(target.row)>0)
{
UpdateForDB(cond.parameters,cond.format,subtable,subtable.format,"hpts",generator.name)
newdata<-SelectFromDB(generator.parameters,generator.format,"hpts",generator.name)
generator.table.id<-newdata[1,]$id
}
else
generator.table.id<-InsertToDB(generator.parameters,generator.format,"hpts",generator.name)
}
else
generator.table.id<-InsertToDB(generator.parameters,generator.format,"hpts",generator.name)
return(generator.table.id)
}
parseCombinedData<-function(name,data,datastr){
name<-sub(" ","",name)
r<-eval(parse(text=paste0("data.frame(name=character(),",datastr,"=character(),stringsAsFactors = FALSE)")))
if(length(data)==0)
stop("data is NULL in function parseCombinedData")
if(is.list(data))
{
for(i in 1:length(data))
{
subname<-paste0(name,"_",i)
r<-rbind(r,parseCombinedData(subname,data[i][[datastr]],datastr))
}
}else{
#filter blanks from the feature definitions in the xml
name<-gsub(" ","",name)
data<-gsub(" ","",data)
r<-eval(parse(text=paste0("data.frame(name=name,",datastr,"=data,stringsAsFactors = FALSE)")))
}
return (r)
}
# serialize xml feature/variant file to a dataframe
SerializeXmlDoc<-function(doc.xml,datastr)
{
doc.list<-xmlToList(doc.xml)
doc.seril<-eval(parse(text=paste0("data.frame(name=character(),",datastr,"=character())")))
for(i in 1:length(doc.list))
{
f<-doc.list[i]
if(names(f)!="feature"&&names(f)!="variant")
stop(sprintf("error format in xml with %s!\n",datastr))
doc.seril<-eval(parse(text=paste0('rbind(doc.seril,parseCombinedData(f$',names(f),'$name,f$',names(f),'$',
datastr,',"',datastr,'"))')))
}
return(doc.seril)
}
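# Example (a sketch; the xml layout below is illustrative): given a
# featureinfo.xml such as
#   <features>
#     <feature><name>runtime</name><datatype>numerical</datatype></feature>
#   </features>
# SerializeXmlDoc("featureinfo.xml","datatype") yields the one-row data frame
#   name="runtime", datatype="numerical"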
GetEnableList<-function(doc.xml,Nameaskey=TRUE)
{
doc.list<-xmlToList(doc.xml)
doc.seril<-data.frame(name=character(),datatype=character())
for(i in 1:length(doc.list))
{
f<-doc.list[i]
if(names(f)!="feature"&&names(f)!="variant")
stop("error format in xml with datatype!\n")
tmp<-eval(parse(text=paste0('parseCombinedData(f$',names(f),
'$name,f$',names(f),'$datatype,"datatype")')))
tmp$enable_variable<-eval(parse(text=paste0('f$',names(f),'$enable_variable')))
tmp$oldname<-gsub(" ","",eval(parse(text=paste0('f$',names(f),'$name'))))
doc.seril<-rbind(doc.seril,tmp)
}
enable.list<-list()
for(i in 1:nrow(doc.seril))
{
tmp<-doc.seril[i,]
enable_str<-sub(tmp$oldname,tmp$name,tmp$enable_variable)
if(Nameaskey)
enable.list[[tmp$name]]<-enable_str
else
enable.list[[enable_str]]<-tmp$name
}
return(enable.list)
}
GetDatatypeList<-function(doc.xml)
{
doc.seril<-SerializeXmlDoc(doc.xml,"datatype")
datatype.list<-list()
for(i in 1:nrow(doc.seril))
{
tmp<-doc.seril[i,]
datatype.list[[tmp$name]]<-tmp$datatype
}
return(datatype.list)
}
# deserialize a data frame into an XML document; requires an xmlformat template file
UnSerializeXmlDoc<-function(doc.seril,xmlformat)
{
# subformat
FillXMLNode<-function(subformat,data,name)
{
if(is.list(subformat))
{
fnode<-newXMLNode(name = "value",parent = fsnode);
for(i in 1:length(subformat))
{
subname<-paste0(name,"_",i);
addChildren(fnode,
FillXMLNode(subformat[i],data,subname)
);
}
return (fnode);
}else{
value<-data[which(data$name==name),];
if(nrow(value)==0)
return (NA);
if(nrow(value)==1)
{
return (newXMLNode(name="value",value));
}else{
stop("stop in Fill FillXMLNode, because there are two atrribute have same name");
}
}
}
doc.xml = newXMLDoc();
fsnode<-newXMLNode(name="features",doc=doc.xml);
xmlformat.list<-xmlToList(xmlformat);
for(i in 1:length(xmlformat.list))
{
feature<-xmlformat.list[[i]];
subformat<-feature$value;
xmlnode<-FillXMLNode(subformat,doc.seril,feature$name);
if(is.na(xmlnode))
{
next;
} else{
fnode<-newXMLNode(name = "feature",parent = fsnode);
addChildren(fnode,
newXMLNode(name="name",feature$name),
xmlnode
);
}
}
return (doc.xml);
}
InsertToDB<-function(data,format,
dbname = "hpts", tbname = "ttable"){
# Perform an insert operation using 'data' described by 'format' on table 'tbname' of database 'dbname'
#
# Args:
# data: the data to insert into the database, must be a data frame
# format: the format that specifies the data structure and types
# dbname: the database name
# tbname: the table name
#
# Returns:
# the id of the inserted row (last_insert_id)
stopifnot(is.data.frame(data)==TRUE)
stopifnot(is.vector(data$name)==TRUE)
stopifnot(is.vector(data$value)==TRUE)
formatvalue<-function(data,format)
{
values_str<-""
for(i in 1:nrow(data))
{
fname<-data[i,]$name
value<-data[i,]$value
if(i!=1)
values_str<-paste0(values_str,",")
for(j in 1:nrow(format))
{
if(fname==format[j,]$name)
{
datatype<-format[j,]$datatype
if(datatype=="numerical")
values_str<-paste0(values_str,value)
if(datatype=="category")
values_str<-paste0(values_str,'"',value,'"')
if(datatype=="boolen")
values_str<-paste0(values_str,value)
break
}
}
}
return (values_str)
}
s1 <- paste(data$name, collapse = ",")
s2 <- formatvalue(data,format)
cmd.str <- sprintf('insert into %s.%s(%s) values(%s);', dbname, tbname, s1, s2)
conn <- OpenDB()
result<-sqlQuery(conn, cmd.str)
if(length(result)!=0)
stop(result)
tableid <- sqlQuery(conn, "select last_insert_id()")
return(tableid[[1]])
}
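# Example usage (a sketch; assumes a table hpts.mytimer with a 'runtime'
# column already exists -- both names are illustrative):
#   dat <- data.frame(name="runtime",value="1.23",stringsAsFactors=FALSE)
#   fmt <- data.frame(name="runtime",datatype="numerical",stringsAsFactors=FALSE)
#   new.id <- InsertToDB(dat,fmt,"hpts","mytimer")
#   # issues: insert into hpts.mytimer(runtime) values(1.23);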
SelectFromDB<-function(condition,format,
dbname = "hpts", tbname = "ttable"){
# Perform a select operation using 'condition' described by 'format' on table 'tbname' of database 'dbname'
#
# Args:
# condition: the select condition, must be a data frame
# format: the format that specifies the condition structure and types
# dbname: the database name
# tbname: the table name
#
# Returns:
# a data frame of the matching rows
stopifnot(is.data.frame(condition)==TRUE)
stopifnot(is.vector(condition$name)==TRUE)
stopifnot(is.vector(condition$value)==TRUE)
FormatCondition<-function(condition,format)
{
condition.str<-""
for(i in 1:nrow(condition))
{
fname<-condition[i,]$name
value<-condition[i,]$value
for(j in 1:nrow(format))
{
if(fname==format[j,]$name)
{
datatype<-format[j,]$datatype
if(datatype=="numerical")
condition.str<-sprintf(" %s and %s=%s",condition.str,fname,value)
if(datatype=="category")
condition.str<-sprintf(" %s and %s='%s'",condition.str,fname,value)
if(datatype=="boolen")
condition.str<-sprintf(" %s and %s=%s",condition.str,fname,value)
break
}
}
}
return (condition.str)
}
condition.str <- FormatCondition(condition,format)
cmd.str <- sprintf('select * from %s.%s where TRUE %s ;', dbname, tbname, condition.str)
conn <- OpenDB()
result<-sqlQuery(conn, cmd.str)
return (result)
}
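# Example usage (a sketch; table and column names are illustrative):
#   cond <- data.frame(name="compiler",value="gcc",stringsAsFactors=FALSE)
#   fmt  <- data.frame(name="compiler",datatype="category",stringsAsFactors=FALSE)
#   rows <- SelectFromDB(cond,fmt,"hpts","mytimer")
#   # issues: select * from hpts.mytimer where TRUE and compiler='gcc' ;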
RemoveFromDB<-function(condition,format,
dbname = "hpts", tbname = "ttable"){
# Perform a delete operation using 'condition' described by 'format' on table 'tbname' of database 'dbname'
#
# Args:
# condition: the delete condition, must be a data frame
# format: the format that specifies the condition structure and types
# dbname: the database name
# tbname: the table name
#
# Returns:
#
stopifnot(is.data.frame(condition)==TRUE)
stopifnot(is.vector(condition$name)==TRUE)
stopifnot(is.vector(condition$value)==TRUE)
FormatCondition<-function(condition,format)
{
condition.str<-""
for(i in 1:nrow(condition))
{
fname<-condition[i,]$name
value<-condition[i,]$value
for(j in 1:nrow(format))
{
if(fname==format[j,]$name)
{
datatype<-format[j,]$datatype
if(datatype=="numerical")
condition.str<-sprintf(" %s and %s=%s",condition.str,fname,value)
if(datatype=="category")
condition.str<-sprintf(" %s and %s='%s'",condition.str,fname,value)
if(datatype=="boolen")
condition.str<-sprintf(" %s and %s=%s",condition.str,fname,value)
break
}
}
}
return (condition.str)
}
condition.str <- FormatCondition(condition,format)
if(condition.str=="")
stop("error! an empty condition would delete every row in the table")
cmd.str <- sprintf('delete from %s.%s where TRUE %s ;', dbname, tbname, condition.str)
print(cmd.str)
conn <- OpenDB()
result<-sqlQuery(conn, cmd.str)
return (result)
}
UpdateForDB<-function(condition,condition.format,
newdata,newdata.format,
dbname = "hpts", tbname = "ttable"){
# Perform an update operation: set 'newdata' on the rows of table 'tbname' in database 'dbname' that match 'condition'
#
# Args:
# condition: the update condition, must be a data frame
# condition.format: the format that specifies the condition structure and types
# newdata: the new values to set, must be a data frame
# newdata.format: the format that specifies the newdata structure and types
# dbname: the database name
# tbname: the table name
#
# Returns:
# 0 if there is nothing to update
#
stopifnot(is.data.frame(condition)==TRUE)
stopifnot(is.vector(condition$name)==TRUE)
stopifnot(is.vector(condition$value)==TRUE)
stopifnot(is.data.frame(newdata)==TRUE)
stopifnot(is.vector(newdata$name)==TRUE)
stopifnot(is.vector(newdata$value)==TRUE)
if(nrow(newdata)==0)
return(0)
Format<-function(data,format,linker=",")
{
data.str<-""
for(i in 1:nrow(data))
{
fname<-data[i,]$name
value<-data[i,]$value
for(j in 1:nrow(format))
{
if(fname==format[j,]$name)
{
if(i>1)
data.str<-paste(data.str,linker)
datatype<-format[j,]$datatype
if(datatype=="numerical")
data.str<-sprintf(" %s %s=%s",data.str,fname,value)
if(datatype=="category")
data.str<-sprintf(" %s %s='%s'",data.str,fname,value)
if(datatype=="boolen")
data.str<-sprintf(" %s %s=%s",data.str,fname,value)
break
}
}
}
return (data.str)
}
update.str<- Format(newdata,newdata.format)
condition.str <- Format(condition,condition.format,"and")
if(condition.str=="")
stop("error! an empty condition would update every row in the table")
cmd.str <- sprintf('update %s.%s set %s where TRUE and %s ;', dbname, tbname, update.str,condition.str)
conn <- OpenDB()
result<-sqlQuery(conn, cmd.str)
return (result)
}
================================================
FILE: framework/EvaluatorModule/Evaluator.R
================================================
PAK.Evaluator<-setRefClass(
"PAK.Evaluator",
fields = list(sub.evaluators = "list",
analysers.results.store ="list",
analysers.results.evaluate="list"),
methods = list(
#init function
initialize=function(sub.evaluators=list()){
sub.evaluators<<-sub.evaluators
},
# scores are non-positive; the maximum (best) score is 0
getScore=function(analysers.results){
score<-0
for(analyser in names(sub.evaluators))
{
analyser.evaluator<-sub.evaluators[[analyser]]
analyser.results<-analysers.results[[analyser]]
#for each measurement
for(m.name in names(analyser.results))
{
m.result<-analyser.results[[m.name]]
score<-score+analyser.evaluator[[m.name]](m.result)
}
}
return (score)
},
evaluate=function(app){
for(aly in names(sub.evaluators))
{
one.analyser<-PAK.Analyser$new(name=aly,features=names(sub.evaluators[[aly]]),app=app)
one.analyser$anaylze()
analysers.results.store[[aly]]<<-one.analyser$getResultForDB()
analysers.results.evaluate[[aly]]<<-one.analyser$getResult()
}
score<-getScore(analysers.results.evaluate)
return (score)
},
getResults2store=function(){
return (analysers.results.store)
}
)
)
================================================
FILE: framework/ExtractorModule/Analyze.old
================================================
library(XML)
InstanceAnaylze<-function(analyser,analyser.features,app,
analysis_module.path="/home/lyl/program/hpts/web/script/ExtractorModule/anaylsis_module/"){
# perform analysis on the target running instance
#
# Args:
# analyser: a string giving the analyser name
# analyser.features: a string vector containing the enable variables of the features to analyse
# app: a string giving the target program name (with path)
envstr<-""
for(i in 1:length(analyser.features))
{
envvar<-gsub(" ","",analyser.features[i])
envstr<-paste(envstr,sprintf("export %s=TRUE;",envvar))
}
app.name<-basename(app)
app.path<-dirname(app)
#go to the application's directory
envstr<-paste0(envstr,'cd ',app.path,';')
envstr<-paste0(envstr,'sh ',analysis_module.path,analyser,"/analysis.sh ",app.name)
r<-system(envstr,intern = TRUE)
resultfile<-paste0(app.path,"/",r)
anaylsis.result<-xmlToDataFrame(paste0(app.path,"/",r),stringsAsFactors=FALSE)
return(anaylsis.result)
}
#anaylsis.result<-InstanceAnaylze("tau",c("ENABLE_P_WALL_CLOCK_TIME"),"/home/lyl/program/hpts/applications/optimized_1.cpp")
================================================
FILE: framework/ExtractorModule/Extractor.R
================================================
PAK.Extractor<-setRefClass(
"PAK.Extractor",
fields = list(analysers = "list",
analysers.results.store="list",
analysers.results.produce="list"
),
methods = list(
#init function
initialize=function(analysers=list()){
analysers<<-analysers
},
# run every configured analyser on the app and cache its results
extractFeatures=function(app){
for(aly in names(analysers))
{
one.analyser<-PAK.Analyser$new(name=aly,features=analysers[[aly]],app=app)
one.analyser$anaylze()
analysers.results.store[[aly]]<<-one.analyser$getResultForDB()
analysers.results.produce[[aly]]<<-one.analyser$getResult()
}
return (analysers.results.produce)
},
getFeatures2produce=function(){
return (analysers.results.produce)
},
getFeatures2store=function(){
return (analysers.results.store)
}
)
)
================================================
FILE: framework/Interface/Analyser.R
================================================
PAK.Analyser<-setRefClass(
"PAK.Analyser",
fields = list(app = "character",
name = "character",
features = "character",
result = "data.frame",
path = "character",
enable.list ="list",
datatype.list="list"),
methods = list(
#init function
initialize=function(name="",app="",features=character()){
name<<-name
app<<-app
features<<-features
path<<-path.analysis_tools
enable.list<<-GetEnableList(paste0(path,"/",name,"/featureinfo.xml"),Nameaskey = TRUE)
datatype.list<<-GetDatatypeList(paste0(path,"/",name,"/featureinfo.xml"))
},
enableAllfeatures=function(){
features<<-names(enable.list)
},
anaylze=function(){
# perform analysis on the target running instance: export the enable
# variable of each requested feature, run this analyser's analysis.sh
# on the app, and parse the resulting xml file into 'result'
envstr<-""
if(length(features)==0)
stop(sprintf("error! features of analyser %s is NULL!",name))
for(i in 1:length(features))
{
envvar<-gsub(" ","",enable.list[[features[i]]])
envstr<-paste(envstr,sprintf("export %s=TRUE;",envvar))
}
app.name<-basename(app)
app.path<-dirname(app)
#go to the application's directory
envstr<-paste0(envstr,'cd ',app.path,';')
envstr<-paste0(envstr,'sh ',path,name,"/analysis.sh ","./",app.name)
r<-system(envstr,intern = TRUE)
resultfile<-paste0(app.path,"/",r)
result<<-xmlToDataFrame(paste0(app.path,"/",r),stringsAsFactors=FALSE)
},
getResultForDB=function()
{
return (result)
},
getResult=function()
{
result.evaluate<-list()
for(i in 1:nrow(result))
{
tmp<-result[i,]
if(datatype.list[[tmp$name]]!="category")
result.evaluate[[tmp$name]]<-as.numeric(tmp$value)
else
result.evaluate[[tmp$name]]<-tmp$value
}
return (result.evaluate)
}
)
)
================================================
FILE: framework/Interface/Generator.R
================================================
library("methods")
PAK.Generator<-setRefClass(
"PAK.Generator",
fields = list(app = "character",
name = "character",
parameters = "data.frame",
output = "character",
result = "character",
path="character"),
methods = list(
#init function
initialize=function(name="",app="",output="",p.os=NA){
name<<-name
app<<-app
output<<-output
path<<-path.generator_tools
if(is.data.frame(p.os))
setPUsingSpace(p.os)
},
transform=function(){
# perform code transforming on target program
envstr<-""
for(i in 1:nrow(parameters))
{
envvar.name<-gsub(" ","",parameters[i,]$enable_variable)
envvar.value<-gsub(" ","",parameters[i,]$parameter)
envstr<-paste(
envstr,sprintf("export %s='%s';",
envvar.name,
envvar.value))
}
#go to the application's directory
app.name<-basename(app)
app.path<-dirname(app)
envstr<-paste0(envstr,'cd ',app.path,';')
envstr<-paste0(envstr,'sh ',path,name,
"/transform.sh ",app.name," ",output)
variant.file<-system(envstr,intern = TRUE)
result<<-paste0(app.path,"/",variant.file)
},
setPUsingSpace=function(p.in.space){
enable.list<-GetEnableList(paste0(path,"/",name,"/variantinfo.xml"))
parameters<<-data.frame(enable_variable=character(),
parameter=character())
for(p in names(p.in.space) )
parameters<<-
rbind(parameters,data.frame(enable_variable=enable.list[[p]],
parameter=p.in.space[[p]]))
},
getParameterForDB=function(){
enable.list<-GetEnableList(paste0(path,"/",name,"/variantinfo.xml"),FALSE)
df<-as.data.frame(do.call(rbind,mapply(SIMPLIFY = FALSE,function(e,p)
{data.frame(name=enable.list[[e]],value=p,stringsAsFactors = FALSE)},
parameters$enable_variable,parameters$parameter)))
return(df)
}
)
)
================================================
FILE: framework/LearnerModule/Learner.R
================================================
# the Learner module
PAK.Learner<-setRefClass(
"PAK.Learner",
fields = list(model="list",dv.name="character",
idv.name="character"),
methods = list(
#init function
initialize=function(){
},
learnModel=function(training.data,idv,dv){
}
)
)
================================================
FILE: framework/OptimizerModule/Optimizer.R
================================================
PAK.Optimizer<-setRefClass(
"PAK.Optimizer",
fields = list(generator.name ="character",
generator= "PAK.Generator",
output.name="character"),
methods = list(
#init function
initialize=function(generator.name=NA,output.name="optimized.cpp"){
output.name<<-output.name
if(is.character(generator.name))
{
generator.name<<-generator.name
generator<<-PAK.Generator$new(name=generator.name,output=output.name)
}
},
optimize=function(app,parameters){
generator$setPUsingSpace(parameters)
generator$app<<-app
r<-generator$transform()
return(r)
},
getParameters2store=function(){
parameters.list<-list()
print(generator$getParameterForDB())
print(generator.name)
parameters.list[[generator.name]]<-generator$getParameterForDB()
return(parameters.list)
}
)
)
================================================
FILE: framework/ProducerModule/Producer.R
================================================
# a base class for parameter producers
PAK.Producer<-setRefClass(
"PAK.Producer",
fields = list(),
methods = list(
# init function
initialize=function(){
},
getParameter=function(step,extractor.result,last.score)
{
}
)
)
================================================
FILE: framework/Tuning/Tuner.R
================================================
library("methods")
PAK.Tuner<-setRefClass(
"PAK.Tuner",
fields = list(app = "character",
extractor ="PAK.Extractor",
evaluator = "PAK.Evaluator",# list(analyser=list(name=eval.fun)): name is measurement name, and eval.fun is its evaluate function
producer ="PAK.Producer",#parameter producer
optimizer ="PAK.Optimizer",#optimizer
need.store ="logical",
best.score="numeric",
best.parameters="data.frame",
best.results="list"),
methods = list(
#init function
initialize=function(app,extractor=PAK.Extractor$new(),evaluator,producer,optimizer,need.store=FALSE){
app<<-app
extractor<<-extractor
evaluator<<-evaluator
producer<<-producer
optimizer<<-optimizer
need.store<<-need.store
},
checkpoint=function(step,score,r.producer,r.best.score,r.best.parameters,r.best.result){
if(length(dir(pattern="stoptuning"))!=0)
{
print("stop tuning!")
save(step,score,r.producer,r.best.score,r.best.parameters,r.best.result,file="tuningRecord")
return (TRUE)
}
return (FALSE)
},
tune=function(resume.file=NA){
step<-0
score<-numeric()
extractor$extractFeatures(app)
if(!is.na(resume.file))
{
print("load file")
load(resume.file)
producer<<-r.producer
best.score<<-r.best.score
best.parameters<<-r.best.parameters
best.results<<-r.best.result
}
while(TRUE)
{
step<-step+1
cat(sprintf("step : %d",step))
#produce parameters
parameters<-producer$getParameter(step=step,
extractor.result=extractor$getFeatures2produce(),
last.score=score)
#search is end
if(length(parameters)==0)
break
optimized.instance<-optimizer$optimize(app,parameters)
print(parameters)
score<-evaluator$evaluate(optimized.instance)
if(need.store)
store2DB()
if(step==1||score>best.score)
{
best.score<<-score
best.parameters<<-parameters
best.results<<-evaluator$analysers.results.evaluate
}
print(score)
if(score==0||checkpoint(step,score,producer,best.score,best.parameters,best.results))
break
cat("end a step \n")
}
},
store2DB=function(){
keyAnalyser.result<-PerformKeyAnalysis(app)
unkeyAnalyser.result<-extractor$getFeatures2store()
Analyser.result<-c(keyAnalyser.result,unkeyAnalyser.result)
main.id<-StoreAnalysis(Analyser.result,override = TRUE)
StoreTransformation(main.id,optimizer$getParameters2store(),evaluator$getResults2store())
}
)
)
================================================
FILE: framework/dependencies.R
================================================
library("XML")
library("jsonlite")
library("RODBC")
library("methods")
print("load dependencies")
================================================
FILE: framework/lib/OptimizationSpace.R
================================================
GenerationParameterSpace<-function(parameter.list)
{
loop.str<-""
df.str<-""
for(i in 1:length(parameter.list))
{
range<-unlist(strsplit(parameter.list[[i]],";"))[1]
condition<-unlist(strsplit(parameter.list[[i]],";"))[2]
parameter.name<-names(parameter.list[i])
if(is.na(condition))
tmp.str<-sprintf("for(%s in c(%s) )\n",names(parameter.list[i]),range)
else
tmp.str<-sprintf("for(%s in c(%s) )\n if(%s)\n",names(parameter.list[i]),range,condition)
loop.str<-paste0(loop.str,tmp.str)
df.str<-paste0(df.str,sprintf("%s=%s",
parameter.name,parameter.name))
if(i!=length(parameter.list))
df.str<-paste0(df.str,",")
}
df.str<-paste0("combinations.tmp<-data.frame(",df.str,",stringsAsFactors = FALSE)")
body.str<-paste0(df.str,"\nif(nrow(combinations.parameter)==0)\n",
"combinations.parameter<-combinations.tmp\n",
"else\ncombinations.parameter<-rbind(",
"combinations.parameter,combinations.tmp)")
space.str<-paste0(loop.str,"{\n",body.str,"\n}")
combinations.parameter<-data.frame()
eval(parse(text=space.str))
return(combinations.parameter)
}
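# Example: each entry of parameter.list is "range[;condition]"; the generated
# nested loops enumerate every combination that satisfies the condition:
#   pl <- list(x='1,2,3',y='1,2,3;y>x')
#   GenerationParameterSpace(pl)
#   # returns the three rows with y > x: (1,2),(1,3),(2,3)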
#combinations.parameter<-GenerationParameterSpace(parameter.list)
================================================
FILE: framework/lib/learners.R
================================================
# a decision tree learner
library(rpart)
PAK.Learner.DecisionTree<-setRefClass(
"PAK.Learner.DecisionTree",
contains="PAK.Learner",
fields = list(model="list",dv.name="character",
idv.name="character"),
methods = list(
#init function
initialize=function(){
},
learnModel=function(training.data,idv,dv){
buildstr<-sprintf("rpart(%s~.,training.data)",dv)
model[["rp"]]<<-eval(parse(text=buildstr) )
model[["dv.name"]]<<-dv
model[["idv.name"]]<<-idv
return (model)
}
)
)
================================================
FILE: framework/lib/producers.R
================================================
library(rpart)
PAK.Producer.Exhaustion<-setRefClass(
"PAK.Producer.Exhaustion",
contains="PAK.Producer",
fields = list(parameter.space="data.frame"),
methods = list(
#init function
initialize=function(parameter.list){
parameter.space<<-GenerationParameterSpace(parameter.list)
},
getParameter=function(step,extractor.result,last.score)
{
if(step<=nrow(parameter.space))
{
if(ncol(parameter.space)==1)
return (eval(parse(text=sprintf("data.frame(%s=parameter.space[step,])",names(parameter.space)))))
else
return(parameter.space[step,])
}
}
)
)
PAK.Producer.Greedy<-setRefClass(
"PAK.Producer.Greedy",
contains="PAK.Producer",
fields = list(parameter.range="list",#list(p1=c(1,2,3..),p2=c(1,2,3..)..)
v.idx ="numeric",
v.score ="numeric",
v.pos ="numeric",
local.optimal= "data.frame"),
methods = list(
#init function
initialize=function(parameter.range=list()){
parameter.range<<-parameter.range
},
getParameter=function(step,extractor.result,last.score)
{
if(step==1)
{
# start from the first value of every parameter
local.optimal<<-as.data.frame(lapply(parameter.range,function(r) r[1]),stringsAsFactors=FALSE)
v.idx<<-1
v.score<<-c()
v.pos<<-0
}
else
{
v.score<<-c(v.score,last.score)
v.pos<<-v.pos+1
if(v.pos>=length(parameter.range[[v.idx]]))
{
# all values of the current parameter tried: keep the best and advance to the next one
local.optimal[[v.idx]]<<-parameter.range[[v.idx]][which.max(v.score)]
v.idx<<-v.idx+1
if(v.idx>length(parameter.range))
return (data.frame())
v.score<<-c()
v.pos<<-0
}
}
new.parameter<-local.optimal
new.parameter[[v.idx]]<-parameter.range[[v.idx]][v.pos+1]
return (new.parameter)
}
)
)
PAK.Producer.OptimalSpace<-setRefClass(
"PAK.Producer.OptimalSpace",
contains="PAK.Producer",
fields = list(parameter.space="data.frame"),
methods = list(
#init function
initialize=function(tid){
os<-PredictOptimalSpace(tid,combine.os=c(1,3,5,8,10),os.size=25,model=os.models[[1]],cond.str=cond.str,omp.number = 16)
parameter.space<<-convertOS2parameterspace(os)
cat("length of os: ",nrow(parameter.space),"\n")
print(parameter.space)
},
getParameter=function(step,extractor.result,last.score)
{
if(step>nrow(dv.prs))
parameter<-eval(parse(text=sprintf("data.frame(%s=NULL)",dv.name)))
else
parameter<-eval(parse(text=sprintf("data.frame(%s='%s')",dv.name,dv.prs$type[step])))
return (parameter)
}
)
)
================================================
FILE: generator_module/optimizeCompilerFlag/transform.sh
================================================
#!/bin/bash
CC=gcc
#export ENABLE_COMPILERFLAG="-O3"
CFLAGENABLE=`env |grep ENABLE_COMPILERFLAG`
FNAME=${CFLAGENABLE#ENABLE_}
FNAME=${FNAME%%=*}
SUFFIX=${FNAME##*_}
FNAME=${FNAME%%_*}
if [ $FNAME = "COMPILERFLAG" ] ; then
CFLAG=${CFLAGENABLE#*=}
$CC $CFLAG $1 -o $2
fi
echo $2
================================================
FILE: generator_module/optimizeCompilerFlag/variantinfo.xml
================================================
<variants>
<variant>
<name>compilerflag</name>
<description>the compiler flag</description>
<enable_variable>ENABLE_COMPILERFLAG</enable_variable>
<datatype>string</datatype>
</variant>
</variants>
================================================
FILE: pak.R
================================================
path.analysis_tools<-"/home/liujh/PAK-master/analysis_module/"
path.generator_tools<-"/home/liujh/PAK-master/generator_module/"
odbc.source<-"kdb"
database.user<-"user"
database.pwd<-"pwd"
GetSourceFileDir<-function()
{
frame_files <- lapply(sys.frames(), function(x) x$ofile)
frame_files <- Filter(Negate(is.null), frame_files)
path.sourcefile <- dirname(frame_files[[length(frame_files)]])
return(path.sourcefile)
}
SourceDir <- function(path, trace = FALSE)
{
#if(missing(path)) path <- getwd()
for(i in 1:length(path))
{
for (nm in list.files(path, pattern = "\\.[Rr]$",recursive=TRUE))
{
if(trace) cat(nm,":")
source(file.path(path, nm))
if(trace) cat("\n")
}
}
}
sfdir<-GetSourceFileDir()
source(file.path(sfdir,"framework/dependencies.R"))
SourceDir(file.path(sfdir,"framework/Interface"))
SourceDir(file.path(sfdir,"framework/ExtractorModule"))
SourceDir(file.path(sfdir,"framework/ProducerModule"))
SourceDir(file.path(sfdir,"framework/OptimizerModule"))
SourceDir(file.path(sfdir,"framework/EvaluatorModule"))
SourceDir(file.path(sfdir,"framework/LearnerModule"))
SourceDir(file.path(sfdir,"framework/DBModule"))
SourceDir(file.path(sfdir,"framework/Tuning"))
SourceDir(file.path(sfdir,"framework/lib"))
PerformKeyAnalysis<-function(app){
# run the key analysers whose results identify a record in the main table
key.analyser.names<-c("appinfo","envinfo")
result<-list()
for(ka in key.analyser.names)
{
analyser<-PAK.Analyser$new(name=ka,app=app)
analyser$enableAllfeatures()
analyser$anaylze()
result[[ka]]<-analyser$getResultForDB()
}
return (result)
}
================================================
FILE: tutorial/autotuning_compilerflag.R
================================================
GetSourceFileDir<-function()
{
frame_files <- lapply(sys.frames(), function(x) x$ofile)
frame_files <- Filter(Negate(is.null), frame_files)
path.sourcefile <- dirname(frame_files[[length(frame_files)]])
return(path.sourcefile)
}
sfdir<-GetSourceFileDir()
source(file.path(sfdir,"../pak.R"))
# Exhaustion algorithm autotuner
if(TRUE) # set to FALSE to skip this example
{
app<-"/home/liujh/PAK-master/applications/multiplyexample.c"
parameter.list<-list()
parameter.list["compilerflag"]<-'" -O0" ," -O1"," -O2"," -O3";'
myproducer<-PAK.Producer.Exhaustion$new(parameter.list)
myoptimizer<-PAK.Optimizer$new(generator.name="optimizeCompilerFlag")
myevaluator<-PAK.Evaluator$new(sub.evaluators=list(paktimer=list(time=function(x){if(x>0) return (0-x) else return(0)})))
tuning<-PAK.Tuner$new(app=app,optimizer=myoptimizer,evaluator=myevaluator,producer =myproducer,need.store=FALSE)
tuning$tune()
}