[
  {
    "path": "LICENSE",
    "content": "Copyright (c) 2016, Simon Fraser University\nAll rights reserved.\n\nRedistribution and use in source and binary forms, with or without \nmodification, are permitted provided that the following conditions are met: \n1. Redistributions of source code must retain the above copyright notice, \n   this list of conditions and the following disclaimer.\n2. Redistributions in binary form must reproduce the above copyright notice, \n   this list of conditions and the following disclaimer in the documentation \n   and/or other materials provided with the distribution.\n\nTHIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS \"AS IS\" \nAND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE \nIMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE \nARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT HOLDER OR CONTRIBUTORS BE \nLIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR \nCONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF \nSUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS \nINTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN \nCONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) \nARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE \nPOSSIBILITY OF SUCH DAMAGE. \n\n\n"
  },
  {
    "path": "Makefile",
    "content": "USE_CPU := -DCPU_ONLY=0\n\n# TODO: Update next 2 directories\nCAFFE_LSTM_DIR := <SomePath1>/caffe-lstm\nDLIB_DIR := <SomePath2>/dlib\n\nRM := rm -rf\nCC := g++\nCC_OPTIONS = -std=gnu++0x -Wall -c -fmessage-length=0 -O3 $(USE_CPU)\nCFLAGS = -fPIC $(CC_OPTIONS)\n\n# TODO: Add/Remove if needed (e.g. Opencv directories)\nINCS_DIRS := -I$(CAFFE_LSTM_DIR)/include -I$(CAFFE_LSTM_DIR)/build/src -I$(DLIB_DIR)\n#             -I<SomePath3>/LIB/OPENCV/3.0.0-CUDA65/include    \\\n#             -I/usr/include/openblas    \\\n#             -I/usr/local/cuda-6.5/include    \\\n#             -I<SomePath3>/LIB/BOOST/1.57.0/include    \\\n#             -I<SomePath3>/LIB/GLOG/0.3.3/include    \\\n#             -I<SomePath3>/LANG/PYTHON/2.7.6-SYSTEM/include/python2.7\n\nLIBS_DIRS := -L$(DLIB_DIR) -L$(CAFFE_LSTM_DIR)/build/lib\n#             -L<SomePath3>/LIB/OPENCV/3.0.0-CUDA65/lib    \\\n#             -L/usr/lib    \\\n#             -L/usr/local/cuda-6.5/lib64    \\\n#             -L<SomePath3>/LANG/PYTHON/2.7.6-SYSTEM/lib    \\\n#             -L<SomePath3>/LIB/GLOG/0.3.3/lib    \\\n#             -L<SomePath3>/LIB/BOOST/1.57.0/lib    \\\n#             -L/cs/vml2/msibrahi/workspaces/software/dlib/examples/build/dlib_build\n\nLIBS := -lboost_system -lboost_filesystem -lboost_chrono -lboost_python            \\\n        -lopencv_core -lopencv_highgui -lopencv_imgproc -lopencv_ml                \\\n        -lpython2.7 -lleveldb -lprotobuf -lgflags -lglog -pthread -lcaffe -ldlib\n\n############################################################\n\n\nSRC_BASE := src\nSRC_BASE_OUT = release\nAPP_BASE := apps\nAPP_BASE_OUT = apps-release\n\nSRCS := $(wildcard  $(SRC_BASE)/*.cpp)\nOBJS := $(addprefix $(SRC_BASE_OUT)/, $(patsubst %.cpp,%.o,$(notdir $(SRCS))))\nDEPS := $(addprefix $(SRC_BASE_OUT)/, $(patsubst %.cpp,%.d,$(notdir $(SRCS))))\n\n# NOTE: conditional directives must not be tab-indented, or make treats them as recipe lines\nifneq ($(MAKECMDGOALS),clean)\n  ifneq ($(strip $(DEPS)),)\n    -include $(DEPS)\n  endif\nendif\n\nTARGET1 = exePhase1_2\nTARGET2 = 
exePhase3\nTARGET3 = exePhase4\n\nall:\n\tmkdir -p $(SRC_BASE_OUT)\n\tmkdir -p $(APP_BASE_OUT) \n\t$(MAKE) $(MAKEFILE) $(TARGET1)\n\t$(MAKE) $(MAKEFILE) $(TARGET2)\n\t$(MAKE) $(MAKEFILE) $(TARGET3)\n\n$(SRC_BASE_OUT)/%.o: $(SRC_BASE)/%.cpp\n\t@echo 'Building file: $<'\n\t@echo 'Invoking: GCC C++ Compiler'\n\t$(CC) $(CFLAGS) $(INCS_DIRS) -fPIC -MMD -MP -MF\"$(@:%.o=%.d)\" -MT\"$(@:%.o=%.d)\" -o \"$@\" \"$<\"\n\t@echo 'Finished building: $<'\n\t@echo ' '\n\n$(APP_BASE_OUT)/%.o: $(APP_BASE)/%.cpp\n\t@echo 'Building file: $<'\n\t@echo 'Invoking: GCC C++ Compiler'\n\t$(CC) $(CFLAGS) $(INCS_DIRS) -fPIC -MMD -MP -MF\"$(@:%.o=%.d)\" -MT\"$(@:%.o=%.d)\" -o \"$@\" \"$<\"\n\t@echo 'Finished building: $<'\n\t@echo ' '\n\n$(TARGET1): $(OBJS) $(APP_BASE_OUT)/$(TARGET1).o\n\t@echo 'Building TARGET1: $@'\n\t@echo 'Invoking: GCC C++ Linker'\n\t$(CC) $(LIBS_DIRS) -o $(TARGET1) $(SRC_BASE_OUT)/*.o $(APP_BASE_OUT)/$(TARGET1).o $(LIBS)\n\t@echo 'Finished building TARGET1: $@'\n\t@echo ' '\n\t\n$(TARGET2): $(OBJS) $(APP_BASE_OUT)/$(TARGET2).o\n\t@echo 'Building TARGET2: $@'\n\t@echo 'Invoking: GCC C++ Linker'\n\t$(CC) $(LIBS_DIRS) -o $(TARGET2) $(SRC_BASE_OUT)/*.o $(APP_BASE_OUT)/$(TARGET2).o $(LIBS)\n\t@echo 'Finished building TARGET2: $@'\n\t@echo ' '\n\t\n$(TARGET3): $(OBJS) $(APP_BASE_OUT)/$(TARGET3).o\n\t@echo 'Building TARGET3: $@'\n\t@echo 'Invoking: GCC C++ Linker'\n\t$(CC) $(LIBS_DIRS) -o $(TARGET3) $(SRC_BASE_OUT)/*.o $(APP_BASE_OUT)/$(TARGET3).o $(LIBS)\n\t@echo 'Finished building TARGET3: $@'\n\t@echo ' '\t\n\nclean:\n\t-$(RM) $(SRC_BASE_OUT)\n\t-$(RM) $(APP_BASE_OUT)\n\t-$(RM) $(TARGET1)\n\t-$(RM) $(TARGET2)\n\t-$(RM) $(TARGET3)\n\n.PHONY: clean all\n"
  },
  {
    "path": "README.md",
    "content": "## [A Hierarchical Deep Temporal Model for Group Activity Recognition. Mostafa S. Ibrahim, Srikanth Muralidharan, Zhiwei Deng, Arash Vahdat, Greg Mori.  IEEE Computer Vision and Pattern Recognition 2016](http://www.cs.sfu.ca/~mori/research/papers/ibrahim-cvpr16.pdf)\n\n## Contents\n0. [History](#history)\n0. [Abstract](#abstract)\n0. [Model](#model)\n0. [Dataset](#dataset)\n0. [Experiments](#experiments)\n0. [Installation](#installation)\n0. [License and Citation](#license-and-citation)\n0. [Poster and Powerpoint](#poster-and-powerpoint)\n\n## History\n* The first version of this work was accepted at CVPR 2016.\n* An extended version is available on arXiv. [Link](http://arxiv.org/pdf/1607.02643v1.pdf).\n* This version builds on the previous one to include the following:\n  * We have collected an expanded Volleyball dataset that is 3 times larger than the CVPR submission.\n  * We conducted further analysis of experimental results and included comparisons to an additional set of baseline methods.\n  * We implemented a variant of our approach that performs spatial pooling strategies over people.\n* The provided dataset is the expanded version. Please use and compare against this version.\n\n## Abstract\nIn group activity recognition, the temporal dynamics of the whole activity can be inferred from the dynamics of the individual people performing the activity. We build a deep model to capture these dynamics based on LSTM models. Building on this observation, we present a **2-stage deep temporal model for the group activity recognition** problem.  In our model, an LSTM model is designed to represent **action dynamics of individual people** in a sequence, and another LSTM model is designed to **aggregate person-level information** for whole-activity understanding.  
We evaluate our model over two datasets: the Collective Activity Dataset and a new volleyball dataset.\n\n## Model\n<img src=\"https://github.com/mostafa-saad/deep-activity-rec/blob/master/img/fig1.png\" alt=\"Figure 1\" height=\"400\" >\n\n**Figure 1**: High-level figure for group activity recognition via a hierarchical model. Each person in a scene is modeled using a temporal model that captures his/her dynamics; these models are integrated into a higher-level model that captures scene-level activity.\n\n<img src=\"https://github.com/mostafa-saad/deep-activity-rec/blob/master/img/fig2-b.png\" alt=\"Figure 2\" height=\"400\" >\n\n**Figure 2**: Detailed figure for the model. Given tracklets of K players, we feed each tracklet into a CNN, followed by a person LSTM layer to represent each player's action. We then pool over all people's temporal features in the scene. The output of the pooling layer is fed to the second LSTM network to identify the whole team's activity.\n\n<img src=\"https://github.com/mostafa-saad/deep-activity-rec/blob/master/img/fig3.jpg\" alt=\"Figure 3\" height=\"400\" >\n\n**Figure 3**: The previous basic model drops spatial information. The updated model uses 2-group pooling to capture the spatial arrangement of players.\n\n## Dataset\n\n### [NEW Download Link (all of the below combined, Google Drive)](https://drive.google.com/drive/folders/1rmsrG1mgkwxOKhsr-QYoi9Ss92wQmCOS?usp=sharing). \n\n### [Old Download Link](http://vml.cs.sfu.ca/wp-content/uploads/volleyballdataset/volleyball.zip). \nIf the links don't work at some point, please email me (mostafa.saad.fci@gmail.com)\n\n**Download Error**: Got a quota issue? Google 'How To Fix Google Drive Download Quota Exceeded'\n\n**UPDATE 1**: Many people asked for extracted trajectories. In fact, as in our code, we generate them on the fly using the Dlib Tracker. I extracted and saved them to disk (I did a few verifications). Hopefully this helps. 
[Download](https://drive.google.com/file/d/0B_rSt5dGmwYBQkh2WFNKTjBSeWM/view?usp=sharing).\n\n**UPDATE 2**: My colleague, Jiawei (Eric) He, recently trained 2 Faster-RCNN detectors using the training detections. One detector just detects the person; the other detects the action of the person. Each row has the format: [Image name # of detections x y w h confidence category (for each detection)]. In many scenarios such data can be useful and save you time. I did a few verifications on them. Note that these data are not used in our models; they are provided to help :). [Download](https://drive.google.com/file/d/0B_rSt5dGmwYBQXVqLUNKd3FUdVE/view?usp=sharing).\n\n**UPDATE 3 - NEW**: Special thanks to Norimichi Ukita (a [professor](https://www.toyota-ti.ac.jp/Lab/Denshi/iim/ukita/) at Toyota Technological Institute) for providing manual annotations for the trajectories on all video sequences. [Download](https://drive.google.com/open?id=1M-fXmAVw8WyFr30xb-LMi_Z-Qiv2nFzl). Kindly check out the README file for the data format, and cite their paper if you use the annotations (Heatmapping of People Involved in Group Activities, Kohei Sendo and Norimichi Ukita, MVA 2019) \n\n**UPDATE 4 - NEW**: Special thanks to Mauricio Perez. In their recent paper, [Skeleton-based relational reasoning for group activity analysis](https://www.sciencedirect.com/science/article/abs/pii/S0031320321005409), they manually annotated the ball locations in the frames. Kindly cite their paper if you use their [dataset extension](https://drive.google.com/file/d/1urZpZiiepC85JD1u3VeURgUpztRgI0yl/edit)\n\n\nWe collected a new dataset using publicly available **YouTube volleyball** videos. We annotated **4830 frames** that were handpicked from **55 videos** with 9 player action labels and 8 team activity labels. 
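\n\nThe per-frame annotation format detailed under 'Further information' below ({Frame ID} {Frame Activity Class}, then {Action Class} X Y W H per player) is straightforward to load yourself. A minimal parsing sketch (not part of the released code; the helper name is made up for illustration):\n\n
```python
# Minimal sketch for parsing one line of a video annotations.txt file,
# following the format documented in this README:
#   {Frame ID} {Frame Activity Class} then {Action Class} X Y W H per player.
# The helper name parse_annotation_line is illustrative only.
def parse_annotation_line(line):
    tokens = line.split()
    frame_id, activity = tokens[0], tokens[1]
    players = []
    # each player annotation is 5 whitespace-separated fields
    for i in range(2, len(tokens), 5):
        action = tokens[i]
        x, y, w, h = map(int, tokens[i + 1:i + 5])
        players.append((action, x, y, w, h))
    return frame_id, activity, players
```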
\n\n<img src=\"https://github.com/mostafa-saad/deep-activity-rec/blob/master/img/dataset1.jpg\" alt=\"Figure 4\" height=\"400\" >\n\n**Figure 4**: A frame labeled as Left Spike; bounding boxes around each team's players are annotated in the dataset.\n\n\n<img src=\"https://github.com/mostafa-saad/deep-activity-rec/blob/master/img/dataset2.jpg\" alt=\"Figure 5\" height=\"400\" >\n\n**Figure 5**: For each visible player, an action label is annotated.\n\nWe used 3493 frames for training, and the remaining 1337 frames for testing. The train-test split is performed at the video level, rather than at the frame level, which makes the evaluation of models more convincing. The lists of action and activity labels and related statistics are tabulated in the following tables:\n\n|Group Activity Class|No. of Instances|\n|---|---|\n|Right set|644|\n|Right spike|623|\n|Right pass|801|\n|Right winpoint|295|\n|Left winpoint|367|\n|Left pass|826|\n|Left spike|642|\n|Left set|633|\n\n\n|Action Classes|No. of Instances|\n|---|---|\n|Waiting|3601|\n|Setting|1332|\n|Digging|2333|\n|Falling|1241|\n|Spiking|1216|\n|Blocking|2458|\n|Jumping|341|\n|Moving|5121|\n|Standing|38696|\n\n**Further information**:\n* The dataset contains 55 videos. Each video has its own folder with a unique ID (0, 1...54)\n * **Train Videos**: 1 3 6 7 10 13 15 16 18 22 23 31 32 36 38 39 40 41 42 48 50 52 53 54\n * **Validation Videos**: 0 2 8 12 17 19 24 26 27 28 30 33 46 49 51\n * **Test Videos**: 4 5 9 11 14 20 21 25 29 34 35 37 43 44 45 47\n* Inside each video directory, a set of directories corresponds to annotated frames (e.g. volleyball/39/29885)\n  * Video 39, frame ID 29885\n* Each frame directory has 41 images (20 images before the target frame, the **target frame** itself, and 20 images after it)\n  * E.g. 
for frame ID: 29885 => Window = {29865, 29866.....29885, 29886....29905}\n  * Scenes change quite rapidly in volleyball, so frames beyond that window usually do not belong to the target frame.\n  * In our work, we used the 5 frames before and the 4 frames after the target frame.\n* Each video directory has an annotations.txt file that contains the annotations of the selected frames.\n* Each annotation line has the format: {Frame ID} {Frame Activity Class} {Player Annotation}  {Player Annotation} ...\n  * A Player Annotation corresponds to a tight bounding box surrounding the player\n* Each {Player Annotation} has the format: {Action Class} X Y W H\n* Videos with a resolution of 1920x1080 are: 2 37 38 39 40 41 44 45 (8 in total). All others are 1280x720.\n\n## Experiments\n\n<img src=\"https://github.com/mostafa-saad/deep-activity-rec/blob/master/img/table-ac.png\" alt=\"Table 1\" height=\"300\" >\n\n**Table 1**: Comparison of the team activity recognition performance of baselines against our model evaluated on the Volleyball Dataset. Experiments use the 2-group style with the max-pool strategy. The last 3 entries are comparisons against the Improved Dense Trajectories approach.\n\n## Installation\n* There are 2 internal projects: a simple one for the sake of validation, and the real pipeline.\n* Download and install the [Dlib library](http://dlib.net/).\n* Download and install the [Caffe-LSTM library](https://github.com/junhyukoh/caffe-lstm). 
\n  * Assume your download disk path is `$lstm_path`\n* `cd $lstm_path/examples`\n* `git clone https://github.com/mostafa-saad/deep-activity-rec.git`\n* Open the Makefile at examples/deep-activity-rec\n  * Update the path locations of the variables CAFFE_LSTM_DIR and DLIB_DIR\n  * Update INCS_DIRS and LIBS_DIRS (based on your environment)\n* Open examples/deep-activity-rec/ibrahim16-cvpr-simple/script-simple.sh\n  * Update the path variable for CAFFE\n* `cd examples/deep-activity-rec`\n* Compile the code: `make all`\n* `cd ../..`\n* Run: `examples/deep-activity-rec/ibrahim16-cvpr-simple/script-simple.sh`\n * Make sure the top console lines don't complain about a \"NOT exist directory\".\n * You may validate the overall console output against the file script-simple-expected-log.txt\n * If there is a complaint, fix it, run script-clean.sh, then rerun script-simple.sh.\n* The code runs in multiple stages, as outlined in the script file.\n  * Processing should end with a simple accuracy table whose values are all close to zero.\n  * The key is to check the console log and make sure no errors occurred.\n  * Otherwise, read the script to understand the different phases and read their logs to find the errors.\n  * Every sub-directory under ibrahim16-cvpr-simple has 1 or more logs.\n  * The directory p4-network2 should have the final model and accuracy table.\n* If everything went well, we can proceed with the actual pipeline.\n* Download the dataset to the path deep-activity-rec/volleyball\n * Use the same directory structure as the provided deep-activity-rec/volleyball-simple\n* Whatever steps/changes you did for ibrahim16-cvpr-simple, do them for ibrahim16-cvpr.\n* Run: `examples/deep-activity-rec/ibrahim16-cvpr/script.sh`\n* GPU/CPU note:\n  * script.sh has 2 heavy processing phases that need the CPU.\n  * One can also run the following 2 scripts in parallel on the CPU: script-p1-data.sh and script-p2-data-fuse.sh\n  * Then run the following script on the GPU: script-p1-train-p3-p4.sh\n  * The main script runs all these scripts in the required
order.\n\n## License and Citation\n\nSource code is released under the **BSD 2-Clause license**.\n\nIf you use our extended dataset, please cite the following 2 publications; otherwise, cite a suitable subset of them:\n\n    @inproceedings{msibrahiCVPR16deepactivity,\n      author    = {Mostafa S. Ibrahim and Srikanth Muralidharan and Zhiwei Deng and Arash Vahdat and Greg Mori},\n      title     = {A Hierarchical Deep Temporal Model for Group Activity Recognition.},\n      booktitle = {2016 IEEE Conference on Computer Vision and Pattern Recognition (CVPR)},\n      year      = {2016}\n    }\n\n    @article{msibrahiPAMI16deepactivity,\n      author    = {Mostafa S. Ibrahim and Srikanth Muralidharan and Zhiwei Deng and Arash Vahdat and Greg Mori},\n      title     = {Hierarchical Deep Temporal Models for Group Activity Recognition.},\n      journal   = {arXiv preprint arXiv:1607.02643},\n      year      = {2016}\n    }\n\n## Poster and Powerpoint\n* You can find a presentation for the paper [here](https://docs.google.com/presentation/d/1iHMRCghn-dOYc2knvTj8Kp27RRojCsLzCbE8Ax5JCOs/edit?usp=sharing).\n* You can find our CVPR 2016 poster [here](https://github.com/mostafa-saad/deep-activity-rec/blob/master/extra/poster.pdf).\n\n<img src=\"https://github.com/mostafa-saad/deep-activity-rec/blob/master/extra/poster.jpg\" alt=\"Poster\" height=\"400\" >\n\nMostafa (left) and Srikanth (right) presenting the poster.\n"
  },
  {
    "path": "apps/exePhase1_2.cpp",
    "content": "/*\n * w-driver-volleyball-lstm-evaluator.cpp\n *\n *  Created on: Jul 13, 2015\n *      Author: msibrahi\n */\n\n#include <stdio.h>\n#include <stdlib.h>\n\n#include <iostream>\n#include <vector>\n#include <set>\n#include <map>\nusing std::vector;\nusing std::set;\nusing std::map;\nusing std::pair;\nusing std::endl;\nusing std::cout;\n\n#include \"../src/leveldb-writer.h\"\n#include \"../src/custom-macros.h\"\n#include \"../src/rect-helper.h\"\n#include \"../src/utilities.h\"\n#include \"../src/images-utilities.h\"\n#include \"../src/custom-images-macros.h\"\n#include \"../src/dlib-tracker-wrapper.h\"\n#include \"../src/volleyball-dataset-mgr.h\"\nusing MostCV::VolleyballPerson;\nusing MostCV::VolleyballVideoData;\nusing MostCV::VolleyballDatasetPart;\nusing MostCV::VolleyballDatasetMgr;\nusing MostCV::RectHelper;\n\n#include <boost/random/mersenne_twister.hpp>\n#include <boost/random/uniform_int.hpp>\n#include <boost/random/variate_generator.hpp>\n\nconst int resize_width = 256;\nconst int resize_height = 256;\nconst int num_channels = 3;\nconst int kPlayersCount = 12;\n\n/////////////////////////////////////////////////////////////////////////////////////////////////////////\n\nint main(int argc, char** argv) {\n  string program_name = MostCV::consumeStringParam(argc, argv);\n\n  cerr << \"Start: \" << program_name << endl;\n  // read program entry data\n  string dataset_videos_path = MostCV::consumeStringParam(argc, argv);\n  string config_path = MostCV::consumeStringParam(argc, argv);\n  string leveldb_output_path = MostCV::consumeStringParam(argc, argv);\n  int temporal_window = MostCV::consumeIntParam(argc, argv);\n  int step = MostCV::consumeIntParam(argc, argv);\n  int bIsPrepareLSTMData = MostCV::consumeIntParam(argc, argv); // otherwise fusion data\n\n  if (bIsPrepareLSTMData)\n    cerr << \"LSTM 1 preparation\" << endl;\n  else\n    cerr << \"Data Fusion for LSTM 2\" << endl;\n\n  assert(temporal_window > 0);\n  
MostCV::fixDir(config_path);\n  MostCV::fixDir(dataset_videos_path);\n  MostCV::fixDir(leveldb_output_path);\n\n  cerr << \"Loading the dataset...\" << endl;\n  VolleyballDatasetMgr mgr(config_path, dataset_videos_path);\n\n  cerr << \"Temporal window = \" << temporal_window << \" with step = \" << step << \"\\n\\n\";\n\n  vector<Ptr<MostCV::LeveldbWriter> > dbMgrs;\n  Mat blackRectImage = Mat::zeros(resize_width, resize_height, CV_8UC3);\n\n  /////////////////////////////////////////////////////////////////////////////////////////////////////////\n\n  // Create leveldb datasets\n  for (auto &dataset : mgr.dataset_division_) {\n    dataset.dataset_db_name_ = dataset.dataset_name_ + \"-leveldb\";\n    dataset.dataset_db_path_ = leveldb_output_path + dataset.dataset_db_name_;\n\n    MostCV::fixDir(dataset.dataset_db_path_);\n\n    cerr<<\"Creating a new dataset\\n\";\n    dbMgrs.push_back(new MostCV::LeveldbWriter(dataset.dataset_db_path_, resize_height, resize_width, num_channels, false));\n\n    if (bIsPrepareLSTMData)\n      dbMgrs.back()->setLabelsRange(mgr.total_persons_labels);\n    else\n      dbMgrs.back()->setLabelsRange(mgr.total_scene_labels);\n  }\n\n  /////////////////////////////////////////////////////////////////////////////////////////////////////////\n\n  int dataset_pos = 0;\n  boost::mt19937 generator(100);\n  boost::uniform_int<> uni_dist;\n  boost::variate_generator<boost::mt19937&, boost::uniform_int<> > rand_generator(generator, uni_dist);\n\n  for (auto dataset : mgr.dataset_division_) {\n\n    // Shuffle data before use\n    cerr << \"Extracting shuffled elements from \" << dataset.dataset_name_ << \" Data Set. 
Total videos = \" << dataset.videos_vec_.size() << \"\\n\";\n\n    Ptr<MostCV::LeveldbWriter> dbMgr = dbMgrs[dataset_pos++];\n    vector<pair<VolleyballVideoData, string> > database_shuffled;\n\n    for (auto video : dataset.videos_vec_) {\n      for (auto frame_id : video.annot_frame_id_vec_)\n        database_shuffled.push_back(std::make_pair(video, frame_id));\n    }\n\n    std::random_shuffle(database_shuffled.begin(), database_shuffled.end(), rand_generator);\n\n    if (bIsPrepareLSTMData) {\n      cerr << \"Total images for current data set is \" << database_shuffled.size() << \". Overall entries will be <= \"\n           << temporal_window * database_shuffled.size() * kPlayersCount << endl;\n    } else {\n      cerr << \"Total images for current data set is \" << database_shuffled.size() << \". Overall entries will be = \"\n           << temporal_window * database_shuffled.size() * kPlayersCount << endl;\n    }\n    /////////////////////////////////////////////////////////////////////////////////////////////////////////\n\n    for (auto database_entry : database_shuffled) {\n      auto video = database_entry.first;\n      string frame_id = database_entry.second;\n      int frame_label = video.annot_frame_id_to_activity_id_map_[frame_id];\n\n      // prepare tracking data\n      pair<vector<string>, vector<string> > images_paths_seq = video.GetTemporalWindowPaths(frame_id, temporal_window, step, false);\n\n      vector<Mat> imagesSequenceBefore, imagesSequenceAfter;\n      Mat img;\n\n      for (auto path : images_paths_seq.first)\n        imagesSequenceBefore.push_back(cv::imread(path));\n\n      for (auto path : images_paths_seq.second)\n        imagesSequenceAfter.push_back(cv::imread(path));\n\n      if (imagesSequenceAfter.size())\n        img = imagesSequenceAfter.back();\n      else\n        img = imagesSequenceBefore.back();\n\n      assert(!img.empty());\n\n      vector<VolleyballPerson> &persons = video.annot_frame_id_persons_map_[frame_id];\n      
vector<Mat> images;\n      vector<vector<Rect> > persons_tracklets;\n\n      for (auto person : persons) {\n        MostCV::DlibTrackerWrapper tracker(person.bbox_.r);\n        pair<vector<Mat>, vector<Rect> > tracklet = tracker.Process(imagesSequenceBefore, imagesSequenceAfter);\n\n        images = tracklet.first;\n        persons_tracklets.push_back(tracklet.second);\n      }\n\n      // generates temporal_window * kPlayersCount * frames\n      int seq_id = 0, person_pos = 0;\n\n      for (auto tracklet : persons_tracklets) {\n        int rect_pos = 0;\n\n        for (auto img : images) {\n          dbMgr->clearDatum();\n          //MostCV::ShowImage(img(tracklet[rect_pos]));\n          assert(dbMgr->addImageToDatum(img(tracklet[rect_pos]), num_channels));\n\n          if (bIsPrepareLSTMData)\n            dbMgr->setDatumLabel(persons[person_pos].action_id_);\n          else\n            dbMgr->setDatumLabel(frame_label);\n\n          dbMgr->addDatumToBatch(video.video_id_ + \"_\" + frame_id + \"_\" + MostCV::toIntStr(\"000\", seq_id++));\n          rect_pos++;\n        }\n        ++person_pos;\n      }\n\n      // for missing persons, add zero images\n      if (!bIsPrepareLSTMData) {\n        LP(j, kPlayersCount - persons_tracklets.size())\n        {\n          LP(k, temporal_window)\n          {\n            dbMgr->clearDatum();\n            assert(dbMgr->addImageToDatum(blackRectImage, num_channels));\n            dbMgr->setDatumLabel(frame_label);\n            dbMgr->addDatumToBatch(video.video_id_ + \"_\" + frame_id + \"_\" + MostCV::toIntStr(\"000\", seq_id++));\n          }\n        }\n      }\n    }\n    dbMgr->forceFinalize();\n  }\n\n  cerr << \"\\n\\nBye: \" << program_name << endl;\n\n  return 0;\n}\n\n"
  },
  {
    "path": "apps/exePhase3.cpp",
    "content": "#include <stdio.h>\n#include <string>\n#include <iostream>\n#include <vector>\n#include <set>\nusing std::vector;\nusing std::set;\nusing std::string;\nusing std::pair;\nusing std::endl;\nusing std::cout;\n\n#include \"boost/algorithm/string.hpp\"\n#include \"google/protobuf/text_format.h\"\n\n#include \"caffe/blob.hpp\"\n#include \"caffe/common.hpp\"\n#include \"caffe/net.hpp\"\n#include \"caffe/proto/caffe.pb.h\"\n#include \"caffe/util/db.hpp\"\n#include \"caffe/util/io.hpp\"\n#include \"caffe/vision_layers.hpp\"\n\nusing caffe::Blob;\nusing caffe::Caffe;\nusing caffe::Datum;\nusing caffe::Net;\nusing caffe::Layer;\nusing caffe::LayerParameter;\nusing caffe::DataParameter;\nusing caffe::NetParameter;\nusing boost::shared_ptr;\nnamespace db = caffe::db;\n\n#include \"../src/utilities.h\"\n#include \"../src/leveldb-reader.h\"\n#include \"../src/leveldb-writer.h\"\n\nenum fuse_style {\n  concatenate_players = 0,\n  max_pool_players_1 = 1,   //  all players in one vec of feature mid size\n  max_pool_players_2 = 2,\n  max_pool_players_4 = 3,   // divide the ground 4 blocks and max pool it. E.g. 
in 16 players, each group of 4 is max-pooled\n  avg_pool_players_1 = 4,\n  avg_pool_players_2 = 5,\n  avg_pool_players_4 = 6,\n  sum_pool_players_1 = 7,\n  sum_pool_players_2 = 8,\n  sum_pool_players_4 = 9\n};\n\nstring fuse_style_sz[] = { \"concatenate_players\", \"max_pool_players_1\", \"max_pool_players_2\", \"max_pool_players_4\", \"avg_pool_players_1\", \"avg_pool_players_2\",\n    \"avg_pool_players_4\", \"sum_pool_players_1\", \"sum_pool_players_2\", \"sum_pool_players_4\" };\n\nint target_fuse_style = concatenate_players;\nconst int kPlayersCount = 12;\n\nvoid RemoveLastBlock(vector<float> &input, int block_length) {\n  assert((int)input.size() >= block_length);\n\n  for (int i = 0; i < block_length; ++i)\n    input.pop_back();\n}\n\nvoid AddLastBlock(vector<float> &input, int block_length) {\n  for (int i = 0; i < block_length; ++i)\n    input.push_back(0);\n}\n\nvoid RemoveDummyVectors(vector<float> &input, int block_length) {\n  bool is_all_zeros = true;\n\n  while (is_all_zeros && (int) input.size() > block_length) {  // Leave at least 1 block\n    int last_idx = input.size() - 1;\n    // Check whether the entire last block is all zeros\n    for (int i = 0; is_all_zeros && i < block_length; ++i)\n      is_all_zeros &= input[last_idx - i] == 0;\n\n    if (is_all_zeros)\n      RemoveLastBlock(input, block_length);\n  }\n}\n\n// target_blocks_cnt = 1 => merge all sub-vectors into 1 block\n// target_blocks_cnt = 4 => merge every set of consecutive sub-vectors to get 4 blocks in total\nvector<float> VectorsFusing(vector<float> &input, int block_length, int target_blocks_cnt) {\n\n  if (target_fuse_style == avg_pool_players_1 || target_fuse_style == sum_pool_players_1 || target_fuse_style == max_pool_players_1)\n    RemoveDummyVectors(input, block_length);\n  else if (target_fuse_style == concatenate_players) {\n    int cur_blocks = input.size() / block_length;\n\n    // then we need a specific count of boxes\n    assert(cur_blocks >= kPlayersCount);\n\n    
while (cur_blocks > kPlayersCount) {\n      --cur_blocks;\n      RemoveLastBlock(input, block_length);\n    }\n  } else {\n    RemoveDummyVectors(input, block_length);\n\n    while (input.size() > 0 && (input.size() % (block_length * target_blocks_cnt) != 0))\n      AddLastBlock(input, block_length);\n  }\n\n  vector<float> output;\n  const float* pData = &input[0];\n  if (input.size() % (block_length * target_blocks_cnt) != 0) {\n    cerr << \"Error A%(B*C) != 0 => \" << input.size() << \" \" << block_length << \" \" << target_blocks_cnt << \"\\n\";\n    assert(input.size() % (block_length * target_blocks_cnt) == 0);\n  }\n  int merge_blocks_cnt = input.size() / (block_length * target_blocks_cnt);  // merge cnt\n\n  for (int i = 0; i < (int) input.size(); i += block_length * merge_blocks_cnt) {\n    int t = merge_blocks_cnt;\n\n    vector<float> sub_output(block_length);\n\n    for (int j = 0; j < block_length; ++j)\n      sub_output[j] = pData[j];\n\n    pData += block_length;\n    --t;\n\n    while (t--) {\n      for (int j = 0; j < block_length; ++j) {\n        if (target_fuse_style == avg_pool_players_1 || target_fuse_style == avg_pool_players_2 || target_fuse_style == avg_pool_players_4\n            || target_fuse_style == sum_pool_players_1|| target_fuse_style == sum_pool_players_2|| target_fuse_style == sum_pool_players_4)\n          sub_output[j] += pData[j];\n        else\n          sub_output[j] = std::max(sub_output[j], pData[j]);\n      }\n\n      pData += block_length;\n    }\n\n    for (auto val : sub_output)\n      output.push_back(val);\n  }\n\n  if (target_fuse_style == avg_pool_players_1 || target_fuse_style == avg_pool_players_2 || target_fuse_style == avg_pool_players_4) {\n    for (auto &val : output)\n      val /= merge_blocks_cnt;\n  }\n\n  return output;\n}\n\n\n\ntemplate<typename Dtype>\nvoid feature_extraction_pipeline(int &argc, char** &argv) {\n\n  target_fuse_style = MostCV::consumeIntParam(argc, argv, \"target_fuse_style\");\n  
LOG(ERROR)<< \"Fusing style = \"<<fuse_style_sz[target_fuse_style] <<\"\\n\\n\";\n\n  int frames_window = MostCV::consumeIntParam(argc, argv, \"frames_window\");\n  LOG(ERROR)<< \"frames_window = \" << frames_window;\n\n  LOG(ERROR)<< \"Expected batch size = \"<<kPlayersCount * frames_window;\n\n  string computation_mode = MostCV::consumeStringParam(argc, argv);\n\n  if (strcmp(computation_mode.c_str(), \"GPU\") == 0) {\n    uint device_id = MostCV::consumeIntParam(argc, argv, \"device_id\");\n\n    LOG(ERROR)<< \"Using GPU\";\n    LOG(ERROR)<< \"Using Device_id=\" << device_id;\n\n    Caffe::SetDevice(device_id);\n    Caffe::set_mode(Caffe::GPU);\n  } else {\n    LOG(ERROR)<< \"Using CPU\";\n    Caffe::set_mode(Caffe::CPU);\n  }\n\n  string pretrained_binary_proto(MostCV::consumeStringParam(argc, argv));\n  string feature_extraction_proto(MostCV::consumeStringParam(argc, argv));\n\n  LOG(ERROR)<<\"Model: \"<<pretrained_binary_proto;\n  LOG(ERROR)<<\"Proto: \"<<feature_extraction_proto;\n\n  LOG(ERROR)<<\"Creating the test network\\n\";\n  shared_ptr<Net<Dtype> > feature_extraction_net(new Net<Dtype>(feature_extraction_proto, caffe::Phase::TEST));\n\n  LOG(ERROR)<<\"Loading the Model\\n\";\n  feature_extraction_net->CopyTrainedLayersFrom(pretrained_binary_proto);\n\n  vector<string> blob_names_vec;\n\n  int blobs_cnt = MostCV::consumeIntParam(argc, argv, \"blobs_cnt\");\n\n  assert(blobs_cnt > 0);\n\n  LOG(ERROR)<<\"# of blobs is \"<<blobs_cnt;\n\n  LP(i, blobs_cnt)\n  {\n    string blob_name = MostCV::consumeStringParam(argc, argv);\n\n    LOG(ERROR)<<\"blob_name: \"<<blob_name;\n\n    CHECK(feature_extraction_net->has_blob(blob_name)) << \"Unknown feature blob name \" << blob_name << \" in the network \" << feature_extraction_proto;\n\n    blob_names_vec.push_back(blob_name);\n  }\n\n  string output_dataset_name = MostCV::consumeStringParam(argc, argv);\n\n  int num_mini_batches = MostCV::consumeIntParam(argc, argv, \"num_mini_batches\");\n\n  
LOG(ERROR)<<\"num_mini_batches: \"<<num_mini_batches;\n\n  MostCV::LeveldbWriter leveldbWriter(output_dataset_name);\n\n  Datum datum;\n  const int kMaxKeyStrLength = 100;\n  char key_str[kMaxKeyStrLength];\n  vector<Blob<float>*> input_vec;\n  int db_entry_idx = 0;\n  int batch_size = -1;\n  int dim_features = -1;\n\n  std::set<int> batch_labels;        // all our batch value must be same\n  std::set<int> dataset_labels;   // logically database shouldn't have only 1 label\n\n  for (int batch_index = 0; batch_index < num_mini_batches; ++batch_index) {  // e.g. 100 iterations. Probably roll on data if needed\n    feature_extraction_net->Forward(input_vec);  // Take one batch of data (e.g. 50 images), and pass them to end of network\n\n    // Load the Labels\n    const shared_ptr<Blob<Dtype> > label_blob = feature_extraction_net->blob_by_name(\"label\");\n\n    batch_size = label_blob->num();                   // e.g. 16 batches for volleyball..represents the boxes of a frame.\n\n    assert(batch_size == frames_window * kPlayersCount);\n\n    batch_labels.clear();\n    int current_label = -1;\n\n    for (int n = 0; n < batch_size; ++n) {\n      const Dtype* label_blob_data = label_blob->cpu_data() + label_blob->offset(n);  // move offset to ith blob in batch\n      current_label = label_blob_data[0];  // all will be same value\n      batch_labels.insert(current_label);\n      dataset_labels.insert(current_label);\n    }\n\n    if (batch_labels.size() != 1) {  // every 1 batch should have same value\n      cerr << \"\\n\\nERROR. Every 1 batch should have the same value. Inconsistent batch # \" << batch_index + 1 << \"-th\\n\";\n      cerr << \"Overall unique labels are: \" << batch_labels.size() << \". 
The labels seen are: \";\n\n      for (auto label : batch_labels)\n        cerr << label << \" \";\n      cerr << \"\\n\";\n      assert(false);\n    }\n    vector<shared_ptr<Blob<Dtype> > > feature_blob_vec;\n\n    for (auto blob_name : blob_names_vec) {\n      shared_ptr<Blob<Dtype> > feature_blob = feature_extraction_net->blob_by_name(blob_name);  // get e.g. fc7 blob for the batch\n      feature_blob_vec.push_back(feature_blob);\n    }\n\n    int total_dim_features = 0;\n    static bool print_once_feature_vec = true;\n\n    if (print_once_feature_vec)\n      LOG(ERROR)<<\"\\n\\n\";\n\n    for (auto feature_blob : feature_blob_vec) {\n      dim_features = feature_blob->count() / batch_size;  // e.g. 4096\n      total_dim_features += dim_features;                 // e.g. 4096 of fc7 + 250 of lstm1\n\n      if (print_once_feature_vec)\n        LOG(ERROR)<<\"ith Vector Length = \"<<dim_features;\n    }\n\n    vector<vector<float> > window_feature_vecs(frames_window);\n\n    for (int n = 0; n < batch_size; ++n) {\n      for (auto feature_blob : feature_blob_vec) {\n        dim_features = feature_blob->count() / batch_size;  // e.g. 
4096\n\n        const Dtype* feature_blob_data = feature_blob->cpu_data() + feature_blob->offset(n);  // move offset to ith blob in batch\n\n        int p = n % frames_window;\n        for (int d = 0; d < dim_features; ++d)\n          window_feature_vecs[p].push_back(feature_blob_data[d]);\n      }\n    }\n    for (auto &feature_vec : window_feature_vecs) {\n      if (target_fuse_style == max_pool_players_1 || target_fuse_style == avg_pool_players_1 || target_fuse_style == sum_pool_players_1)\n        feature_vec = VectorsFusing(feature_vec, total_dim_features, 1);\n      else if (target_fuse_style == max_pool_players_2 || target_fuse_style == avg_pool_players_2 || target_fuse_style == sum_pool_players_2)\n        feature_vec = VectorsFusing(feature_vec, total_dim_features, 2);\n      else if (target_fuse_style == max_pool_players_4 || target_fuse_style == avg_pool_players_4 || target_fuse_style == sum_pool_players_4)\n        feature_vec = VectorsFusing(feature_vec, total_dim_features, 4);\n      // otherwise, keep it concatenated\n\n      if (print_once_feature_vec)\n        LOG(ERROR)<<\"Fused Vector Length = \"<<feature_vec.size();\n\n      print_once_feature_vec = false;\n\n      datum.set_width(1);\n      datum.set_channels(1);\n      datum.clear_data();\n      datum.clear_float_data();\n      datum.set_height(feature_vec.size());\n\n      for (int p = 0; p < (int) feature_vec.size(); ++p)\n        datum.add_float_data(feature_vec[p]);\n\n      int length = snprintf(key_str, kMaxKeyStrLength, \"%010d\", db_entry_idx);  // zero-padded keys keep leveldb entries in insertion order\n      leveldbWriter.addDatumToBatch(datum, string(key_str, length), current_label);\n      ++db_entry_idx;\n      feature_vec.clear();\n    }\n  }\n  leveldbWriter.forceFinalize();\n\n  assert(dataset_labels.size() > 1);  // some label variety makes sense\n}\n\nint main(int argc, char** argv) {\n  ::google::InitGoogleLogging(argv[0]);\n  MostCV::consumeStringParam(argc, argv);  // read program entry data\n\n  
LOG(ERROR)<< \"Make sure LD_LIBRARY_PATH points to the LSTM implementation when using LSTM\\n\\n\";\n\n  // keep going as long as chunks of arguments remain\n  while (argc) {\n    if (argc < 6) {\n      LOG(ERROR)<< \"At least 6 parameters expected\\n\";\n      assert(false);\n    }\n\n    feature_extraction_pipeline<float>(argc, argv);\n    LOG(ERROR)<< \"\\n\\nSuccessfully extracted the features!\\n\\n\";\n  }\n\n  return 0;\n}\n"
  },
  {
    "path": "apps/exePhase4.cpp",
    "content": "/*\n * w-driver-volleyball-lstm-evaluator.cpp\n *\n *  Created on: Jul 13, 2015\n *      Author: msibrahi\n */\n\n#include <iostream>\n#include <vector>\n#include <stdio.h>\n#include <string>\n#include <set>\n#include <map>\n#include <iomanip>\nusing std::vector;\nusing std::set;\nusing std::multiset;\nusing std::map;\nusing std::pair;\nusing std::string;\nusing std::endl;\nusing std::cerr;\n\n#include \"boost/algorithm/string.hpp\"\n#include \"google/protobuf/text_format.h\"\n\n#include \"caffe/blob.hpp\"\n#include \"caffe/common.hpp\"\n#include \"caffe/net.hpp\"\n#include \"caffe/proto/caffe.pb.h\"\n#include \"caffe/util/db.hpp\"\n#include \"caffe/util/io.hpp\"\n#include \"caffe/vision_layers.hpp\"\n\nusing caffe::Blob;\nusing caffe::Caffe;\nusing caffe::Datum;\nusing caffe::Net;\nusing caffe::Layer;\nusing caffe::LayerParameter;\nusing caffe::DataParameter;\nusing caffe::NetParameter;\nusing boost::shared_ptr;\nnamespace db = caffe::db;\n\n#include \"../src/utilities.h\"\n#include \"../src/leveldb-reader.h\"\n\nvoid evaluate(vector<int> truthLabels, vector<int> resultLabels, int w) {\n\n  set<int> total_labels;\n  map<int, map<int, int> > confusion_freq_maps;\n  map<int, int> label_freq;\n  int correct = 0;\n\n  cerr<<\"\\n\\n\";\n  for (int i = 0; i < (int) truthLabels.size(); ++i) {\n    correct += truthLabels[i] == resultLabels[i];\n\n    cerr << \"Test \" << i + 1 << \": Result = \" << resultLabels[i] << \" GroundTruth = \" << truthLabels[i] << \"\\n\";\n\n    confusion_freq_maps[truthLabels[i]][resultLabels[i]]++;\n    total_labels.insert(truthLabels[i]);\n    total_labels.insert(resultLabels[i]);\n    label_freq[truthLabels[i]]++;\n  }\n\n  cerr.setf(std::ios::fixed);\n  cerr.precision(2);\n\n  cerr<<\"\\n\\n\";\n  cerr << \"Total testing frames: \" << truthLabels.size() << \" with temporal window: \" << w << \"\\n\";\n  cerr << \"Temporal accuracy : \" << 100.0 * correct / truthLabels.size() << \" %\\n\";\n  cerr << 
\"\\n=======================================================================================\\n\";\n\n  cerr << \"\\nConfusion Matrix - Truth (rows) / Result (cols)\\n\\n\";\n\n  cerr << std::setw(5) << \"T/R\" << \": \";\n\n  for (auto r_label : total_labels)\n    cerr << std::setw(5) << r_label;\n  cerr << \"\\n=======================================================================================\\n\";\n\n  for (auto t_label : total_labels) {\n    int sum = 0;\n    cerr << std::setw(5) << t_label << \": \";\n\n    for (auto r_label : total_labels)\n    {\n      cerr << std::setw(5) << confusion_freq_maps[t_label][r_label];\n\n      sum += confusion_freq_maps[t_label][r_label];\n    }\n\n    double percent = 0;\n\n    if (label_freq[t_label] > 0)\n      percent = 100.0 * confusion_freq_maps[t_label][t_label] / label_freq[t_label];\n\n    cerr << \" \\t=> Total Correct = \" << std::setw(5) << confusion_freq_maps[t_label][t_label] << \" / \" << std::setw(5) << sum << \" = \" << percent << \" %\\n\";\n  }\n\n\n  cerr<<\"\\n\\n\";\n  cerr << std::setw(7) << \"T/R\" << \": \";\n\n  for (auto r_label : total_labels)\n    cerr << std::setw(7) << r_label;\n  cerr << \"\\n=======================================================================================\\n\";\n\n  for (auto t_label : total_labels) {\n    cerr << std::setw(7) << t_label << \": \";\n\n    for (auto r_label : total_labels)\n    {\n      double percent = 0;\n\n      if (label_freq[t_label] > 0)\n        percent = 100.0 * confusion_freq_maps[t_label][r_label] / label_freq[t_label];\n\n      cerr << std::setw(7) << percent;\n    }\n    cerr<<\"\\n\";\n  }\n\n  cerr<<\"\\nTo map label IDs to names, see the dataset loading logs\\n\";\n}\n\nint getArgmax(vector<float> &v) {\n  int pos = 0;\n\n  assert(v.size() > 0);\n\n  for (int j = 1; j < (int) v.size(); ++j) {\n    if (v[j] > v[pos])\n      pos = j;\n  }\n  return pos;\n}\n\ntemplate<typename Dtype>\nvoid feature_extraction_pipeline(int &argc, char** 
&argv) {\n\n  int frames_window = MostCV::consumeIntParam(argc, argv);\n  LOG(ERROR)<< \"Temporal Window = \" << frames_window;\n\n  string computation_mode = MostCV::consumeStringParam(argc, argv);\n\n  if (strcmp(computation_mode.c_str(), \"GPU\") == 0) {\n    uint device_id = MostCV::consumeIntParam(argc, argv);\n\n    LOG(ERROR)<< \"Using GPU\";\n    LOG(ERROR)<< \"Using Device_id = \" << device_id;\n\n    Caffe::SetDevice(device_id);\n    Caffe::set_mode(Caffe::GPU);\n  } else {\n    LOG(ERROR)<< \"Using CPU\";\n    Caffe::set_mode(Caffe::CPU);\n  }\n\n  string pretrained_binary_proto(MostCV::consumeStringParam(argc, argv));\n  string feature_extraction_proto(MostCV::consumeStringParam(argc, argv));\n\n  LOG(ERROR)<<\"Model: \"<<pretrained_binary_proto<<\"\\n\";\n  LOG(ERROR)<<\"Proto: \"<<feature_extraction_proto<<\"\\n\";\n\n  LOG(ERROR)<<\"Creating the test network\\n\";\n  shared_ptr<Net<Dtype> > feature_extraction_net(new Net<Dtype>(feature_extraction_proto, caffe::Phase::TEST));\n\n  LOG(ERROR)<<\"Loading the Model\\n\";\n  feature_extraction_net->CopyTrainedLayersFrom(pretrained_binary_proto);\n\n  string blob_name = MostCV::consumeStringParam(argc, argv);\n  LOG(ERROR)<<\"blob_name: \"<<blob_name<<\"\\n\";\n\n  CHECK(feature_extraction_net->has_blob(blob_name)) << \"Unknown feature blob name \" << blob_name << \" in the network \" << feature_extraction_proto;\n\n  int num_mini_batches = MostCV::consumeIntParam(argc, argv);\n  LOG(ERROR)<<\"num_mini_batches: \"<<num_mini_batches<<\"\\n\";\n\n  vector<Blob<float>*> input_vec;\n  int batch_size = -1;\n  int dim_features = -1;\n  std::set<int> labels;       // every (2w+1) * batch size MUST all have same label\n\n  vector<int> truthLabels;\n  vector<int> propAvgMaxResultLabels;\n\n  for (int batch_index = 0; batch_index < num_mini_batches; ++batch_index) {  // e.g. 100 iterations. Probably roll on data if needed\n    feature_extraction_net->Forward(input_vec);  // Take one batch of data (e.g. 
50 images) and pass it through the network\n\n    // Load the Labels\n    const shared_ptr<Blob<Dtype> > label_blob = feature_extraction_net->blob_by_name(\"label\");\n\n    batch_size = label_blob->num();                   // e.g. 50 entries in the batch\n\n    assert(batch_size == frames_window);\n\n    int current_label = -1;\n    for (int n = 0; n < batch_size; ++n) {\n      const Dtype* label_blob_data = label_blob->cpu_data() + label_blob->offset(n);  // move offset to ith blob in batch\n      current_label = label_blob_data[0];  // all will be same value\n      labels.insert(current_label);\n\n      if (n == 0)\n        truthLabels.push_back(current_label);\n    }\n\n    if (labels.size() != 1) {  // every batch must carry a single label\n      LOG(ERROR)<< \"Every batch should have a single label value. Found a new value at batch \" << batch_index + 1 << \"\\n\";\n      assert(false);\n    }\n    labels.clear();\n\n    const shared_ptr<Blob<Dtype> > feature_blob = feature_extraction_net->blob_by_name(blob_name);  // get e.g. 
fc7 blob for the batch\n\n    dim_features = feature_blob->count() / batch_size;\n    assert(dim_features > 1);\n\n    const Dtype* feature_blob_data = nullptr;\n\n    vector<float> test_case_sum(dim_features);  // element-wise sum of features over the window\n\n    for (int n = 0; n < batch_size; ++n) {\n      feature_blob_data = feature_blob->cpu_data() + feature_blob->offset(n);  // move offset to ith blob in batch\n\n      for (int j = 0; j < dim_features; ++j)\n        test_case_sum[j] += feature_blob_data[j];\n    }\n\n    propAvgMaxResultLabels.push_back(getArgmax(test_case_sum));\n  }\n\n  evaluate(truthLabels, propAvgMaxResultLabels, 1);\n}\n\nint main(int argc, char** argv) {\n  ::google::InitGoogleLogging(argv[0]);\n  MostCV::consumeStringParam(argc, argv);  // read program entry data\n\n  if (argc < 6) {\n    LOG(ERROR)<< \"At least 6 parameters expected\\n\";\n    assert(false);\n  }\n\n  LOG(ERROR)<< \"Make sure LD_LIBRARY_PATH points to the LSTM implementation when using LSTM\\n\\n\";\n\n  feature_extraction_pipeline<float>(argc, argv);\n\n  return 0;\n}\n"
  },
  {
    "path": "dataset-config/test.txt",
    "content": "4\n5\n9\n11\n14\n20\n21\n25\n29\n34\n35\n37\n43\n44\n45\n47\n"
  },
  {
    "path": "dataset-config/train.txt",
    "content": ""
  },
  {
    "path": "dataset-config/trainval.txt",
    "content": "0\n1\n2\n3\n6\n7\n8\n10\n12\n13\n15\n16\n17\n18\n19\n22\n23\n24\n26\n27\n28\n30\n31\n32\n33\n36\n38\n39\n40\n41\n42\n46\n48\n49\n50\n51\n52\n53\n54\n"
  },
  {
    "path": "dataset-config/val.txt",
    "content": ""
  },
  {
    "path": "dataset-config-simple/test.txt",
    "content": "41\n"
  },
  {
    "path": "dataset-config-simple/train.txt",
    "content": ""
  },
  {
    "path": "dataset-config-simple/trainval.txt",
    "content": "39\n"
  },
  {
    "path": "dataset-config-simple/val.txt",
    "content": ""
  },
  {
    "path": "eclipse-project/ibrahim16-deep-act-rec-part/.cproject",
    "content": "<?xml version=\"1.0\" encoding=\"UTF-8\" standalone=\"no\"?>\n<?fileVersion 4.0.0?><cproject storage_type_id=\"org.eclipse.cdt.core.XmlProjectDescriptionStorage\">\n\t<storageModule moduleId=\"org.eclipse.cdt.core.settings\">\n\t\t<cconfiguration id=\"cdt.managedbuild.config.gnu.exe.debug.1594868632\">\n\t\t\t<storageModule buildSystemId=\"org.eclipse.cdt.managedbuilder.core.configurationDataProvider\" id=\"cdt.managedbuild.config.gnu.exe.debug.1594868632\" moduleId=\"org.eclipse.cdt.core.settings\" name=\"Debug\">\n\t\t\t\t<externalSettings/>\n\t\t\t\t<extensions>\n\t\t\t\t\t<extension id=\"org.eclipse.cdt.core.ELF\" point=\"org.eclipse.cdt.core.BinaryParser\"/>\n\t\t\t\t\t<extension id=\"org.eclipse.cdt.core.GASErrorParser\" point=\"org.eclipse.cdt.core.ErrorParser\"/>\n\t\t\t\t\t<extension id=\"org.eclipse.cdt.core.GmakeErrorParser\" point=\"org.eclipse.cdt.core.ErrorParser\"/>\n\t\t\t\t\t<extension id=\"org.eclipse.cdt.core.GLDErrorParser\" point=\"org.eclipse.cdt.core.ErrorParser\"/>\n\t\t\t\t\t<extension id=\"org.eclipse.cdt.core.CWDLocator\" point=\"org.eclipse.cdt.core.ErrorParser\"/>\n\t\t\t\t\t<extension id=\"org.eclipse.cdt.core.GCCErrorParser\" point=\"org.eclipse.cdt.core.ErrorParser\"/>\n\t\t\t\t</extensions>\n\t\t\t</storageModule>\n\t\t\t<storageModule moduleId=\"cdtBuildSystem\" version=\"4.0.0\">\n\t\t\t\t<configuration artifactName=\"${ProjName}\" buildArtefactType=\"org.eclipse.cdt.build.core.buildArtefactType.exe\" buildProperties=\"org.eclipse.cdt.build.core.buildArtefactType=org.eclipse.cdt.build.core.buildArtefactType.exe,org.eclipse.cdt.build.core.buildType=org.eclipse.cdt.build.core.buildType.debug\" cleanCommand=\"rm -rf\" description=\"\" id=\"cdt.managedbuild.config.gnu.exe.debug.1594868632\" name=\"Debug\" parent=\"cdt.managedbuild.config.gnu.exe.debug\">\n\t\t\t\t\t<folderInfo id=\"cdt.managedbuild.config.gnu.exe.debug.1594868632.\" name=\"/\" resourcePath=\"\">\n\t\t\t\t\t\t<toolChain 
id=\"cdt.managedbuild.toolchain.gnu.exe.debug.1840343081\" name=\"Linux GCC\" superClass=\"cdt.managedbuild.toolchain.gnu.exe.debug\">\n\t\t\t\t\t\t\t<targetPlatform id=\"cdt.managedbuild.target.gnu.platform.exe.debug.256486191\" name=\"Debug Platform\" superClass=\"cdt.managedbuild.target.gnu.platform.exe.debug\"/>\n\t\t\t\t\t\t\t<builder buildPath=\"${workspace_loc:/ibrahim16-deep-act-rec-full}/Debug\" enableAutoBuild=\"true\" id=\"cdt.managedbuild.target.gnu.builder.exe.debug.345619827\" keepEnvironmentInBuildfile=\"false\" managedBuildOn=\"true\" name=\"Gnu Make Builder\" superClass=\"cdt.managedbuild.target.gnu.builder.exe.debug\"/>\n\t\t\t\t\t\t\t<tool id=\"cdt.managedbuild.tool.gnu.archiver.base.363346067\" name=\"GCC Archiver\" superClass=\"cdt.managedbuild.tool.gnu.archiver.base\"/>\n\t\t\t\t\t\t\t<tool id=\"cdt.managedbuild.tool.gnu.cpp.compiler.exe.debug.413146304\" name=\"GCC C++ Compiler\" superClass=\"cdt.managedbuild.tool.gnu.cpp.compiler.exe.debug\">\n\t\t\t\t\t\t\t\t<option id=\"gnu.cpp.compiler.exe.debug.option.optimization.level.99965190\" name=\"Optimization Level\" superClass=\"gnu.cpp.compiler.exe.debug.option.optimization.level\" value=\"gnu.cpp.compiler.optimization.level.none\" valueType=\"enumerated\"/>\n\t\t\t\t\t\t\t\t<option id=\"gnu.cpp.compiler.exe.debug.option.debugging.level.1793442040\" name=\"Debug Level\" superClass=\"gnu.cpp.compiler.exe.debug.option.debugging.level\" value=\"gnu.cpp.compiler.debugging.level.max\" valueType=\"enumerated\"/>\n\t\t\t\t\t\t\t\t<option id=\"gnu.cpp.compiler.option.include.paths.576023052\" name=\"Include paths (-I)\" superClass=\"gnu.cpp.compiler.option.include.paths\" valueType=\"includePath\">\n\t\t\t\t\t\t\t\t\t<listOptionValue builtIn=\"false\" value=\"/rcg/software/Linux/RHEL/6/x86_64/LIB/OPENCV/3.0.0-CUDA65/include\"/>\n\t\t\t\t\t\t\t\t\t<listOptionValue builtIn=\"false\" value=\"/usr/include/openblas\"/>\n\t\t\t\t\t\t\t\t\t<listOptionValue builtIn=\"false\" 
value=\"/cs/vml2/msibrahi/workspaces/caffe-lstm/include\"/>\n\t\t\t\t\t\t\t\t\t<listOptionValue builtIn=\"false\" value=\"/cs/vml2/msibrahi/workspaces/caffe-lstm/build/src\"/>\n\t\t\t\t\t\t\t\t\t<listOptionValue builtIn=\"false\" value=\"/cs/vml2/msibrahi/workspaces/software/dlib\"/>\n\t\t\t\t\t\t\t\t\t<listOptionValue builtIn=\"false\" value=\"/usr/local/cuda-6.5/include\"/>\n\t\t\t\t\t\t\t\t\t<listOptionValue builtIn=\"false\" value=\"/rcg/software/Linux/RHEL/6/x86_64/LIB/BOOST/1.57.0/include\"/>\n\t\t\t\t\t\t\t\t\t<listOptionValue builtIn=\"false\" value=\"/rcg/software/Linux/RHEL/6/x86_64/LIB/GLOG/0.3.3/include\"/>\n\t\t\t\t\t\t\t\t\t<listOptionValue builtIn=\"false\" value=\"/rcg/software/Linux/RHEL/6/x86_64/LANG/PYTHON/2.7.6-SYSTEM/include/python2.7\"/>\n\t\t\t\t\t\t\t\t</option>\n\t\t\t\t\t\t\t\t<option id=\"gnu.cpp.compiler.option.other.other.566250675\" name=\"Other flags\" superClass=\"gnu.cpp.compiler.option.other.other\" value=\"-c -fmessage-length=0 -std=c++0x\" valueType=\"string\"/>\n\t\t\t\t\t\t\t\t<inputType id=\"cdt.managedbuild.tool.gnu.cpp.compiler.input.157465269\" superClass=\"cdt.managedbuild.tool.gnu.cpp.compiler.input\"/>\n\t\t\t\t\t\t\t</tool>\n\t\t\t\t\t\t\t<tool id=\"cdt.managedbuild.tool.gnu.c.compiler.exe.debug.1589680847\" name=\"GCC C Compiler\" superClass=\"cdt.managedbuild.tool.gnu.c.compiler.exe.debug\">\n\t\t\t\t\t\t\t\t<option defaultValue=\"gnu.c.optimization.level.none\" id=\"gnu.c.compiler.exe.debug.option.optimization.level.992112738\" name=\"Optimization Level\" superClass=\"gnu.c.compiler.exe.debug.option.optimization.level\" valueType=\"enumerated\"/>\n\t\t\t\t\t\t\t\t<option id=\"gnu.c.compiler.exe.debug.option.debugging.level.930130026\" name=\"Debug Level\" superClass=\"gnu.c.compiler.exe.debug.option.debugging.level\" value=\"gnu.c.debugging.level.max\" valueType=\"enumerated\"/>\n\t\t\t\t\t\t\t\t<inputType id=\"cdt.managedbuild.tool.gnu.c.compiler.input.935152630\" 
superClass=\"cdt.managedbuild.tool.gnu.c.compiler.input\"/>\n\t\t\t\t\t\t\t</tool>\n\t\t\t\t\t\t\t<tool id=\"cdt.managedbuild.tool.gnu.c.linker.exe.debug.898845305\" name=\"GCC C Linker\" superClass=\"cdt.managedbuild.tool.gnu.c.linker.exe.debug\"/>\n\t\t\t\t\t\t\t<tool id=\"cdt.managedbuild.tool.gnu.cpp.linker.exe.debug.1535101285\" name=\"GCC C++ Linker\" superClass=\"cdt.managedbuild.tool.gnu.cpp.linker.exe.debug\">\n\t\t\t\t\t\t\t\t<option id=\"gnu.cpp.link.option.libs.1390119073\" name=\"Libraries (-l)\" superClass=\"gnu.cpp.link.option.libs\" valueType=\"libs\">\n\t\t\t\t\t\t\t\t\t<listOptionValue builtIn=\"false\" value=\"protobuf\"/>\n\t\t\t\t\t\t\t\t\t<listOptionValue builtIn=\"false\" value=\"opencv_core\"/>\n\t\t\t\t\t\t\t\t\t<listOptionValue builtIn=\"false\" value=\"opencv_highgui\"/>\n\t\t\t\t\t\t\t\t\t<listOptionValue builtIn=\"false\" value=\"opencv_imgproc\"/>\n\t\t\t\t\t\t\t\t\t<listOptionValue builtIn=\"false\" value=\"opencv_ml\"/>\n\t\t\t\t\t\t\t\t\t<listOptionValue builtIn=\"false\" value=\"leveldb\"/>\n\t\t\t\t\t\t\t\t\t<listOptionValue builtIn=\"false\" value=\"gflags\"/>\n\t\t\t\t\t\t\t\t\t<listOptionValue builtIn=\"false\" value=\"leveldb\"/>\n\t\t\t\t\t\t\t\t\t<listOptionValue builtIn=\"false\" value=\"glog\"/>\n\t\t\t\t\t\t\t\t\t<listOptionValue builtIn=\"false\" value=\"boost_system\"/>\n\t\t\t\t\t\t\t\t\t<listOptionValue builtIn=\"false\" value=\"boost_filesystem\"/>\n\t\t\t\t\t\t\t\t\t<listOptionValue builtIn=\"false\" value=\"boost_chrono\"/>\n\t\t\t\t\t\t\t\t\t<listOptionValue builtIn=\"false\" value=\"boost_python\"/>\n\t\t\t\t\t\t\t\t\t<listOptionValue builtIn=\"false\" value=\"python2.7\"/>\n\t\t\t\t\t\t\t\t\t<listOptionValue builtIn=\"false\" value=\"caffe\"/>\n\t\t\t\t\t\t\t\t</option>\n\t\t\t\t\t\t\t\t<option id=\"gnu.cpp.link.option.paths.1032436377\" name=\"Library search path (-L)\" superClass=\"gnu.cpp.link.option.paths\" valueType=\"libPaths\">\n\t\t\t\t\t\t\t\t\t<listOptionValue builtIn=\"false\" 
value=\"/rcg/software/Linux/RHEL/6/x86_64/LIB/OPENCV/3.0.0-CUDA65/lib\"/>\n\t\t\t\t\t\t\t\t\t<listOptionValue builtIn=\"false\" value=\"/cs/vml2/msibrahi/workspaces/caffe-lstm/build/lib\"/>\n\t\t\t\t\t\t\t\t\t<listOptionValue builtIn=\"false\" value=\"/usr/local/lib\"/>\n\t\t\t\t\t\t\t\t\t<listOptionValue builtIn=\"false\" value=\"/usr/lib\"/>\n\t\t\t\t\t\t\t\t\t<listOptionValue builtIn=\"false\" value=\"/usr/local/cuda-6.5/lib64\"/>\n\t\t\t\t\t\t\t\t\t<listOptionValue builtIn=\"false\" value=\"/rcg/software/Linux/RHEL/6/x86_64/LANG/PYTHON/2.7.6-SYSTEM/lib\"/>\n\t\t\t\t\t\t\t\t\t<listOptionValue builtIn=\"false\" value=\"/rcg/software/Linux/RHEL/6/x86_64/LIB/GLOG/0.3.3/lib\"/>\n\t\t\t\t\t\t\t\t\t<listOptionValue builtIn=\"false\" value=\"/rcg/software/Linux/RHEL/6/x86_64/LIB/BOOST/1.57.0/lib\"/>\n\t\t\t\t\t\t\t\t</option>\n\t\t\t\t\t\t\t\t<option id=\"gnu.cpp.link.option.userobjs.1707235531\" name=\"Other objects\" superClass=\"gnu.cpp.link.option.userobjs\" valueType=\"userObjs\">\n\t\t\t\t\t\t\t\t\t<listOptionValue builtIn=\"false\" value=\"/cs/vml2/msibrahi/workspaces/software/dlib/examples/build/dlib_build/libdlib.a\"/>\n\t\t\t\t\t\t\t\t</option>\n\t\t\t\t\t\t\t\t<inputType id=\"cdt.managedbuild.tool.gnu.cpp.linker.input.1819208667\" superClass=\"cdt.managedbuild.tool.gnu.cpp.linker.input\">\n\t\t\t\t\t\t\t\t\t<additionalInput kind=\"additionalinputdependency\" paths=\"$(USER_OBJS)\"/>\n\t\t\t\t\t\t\t\t\t<additionalInput kind=\"additionalinput\" paths=\"$(LIBS)\"/>\n\t\t\t\t\t\t\t\t</inputType>\n\t\t\t\t\t\t\t</tool>\n\t\t\t\t\t\t\t<tool id=\"cdt.managedbuild.tool.gnu.assembler.exe.debug.1771318469\" name=\"GCC Assembler\" superClass=\"cdt.managedbuild.tool.gnu.assembler.exe.debug\">\n\t\t\t\t\t\t\t\t<inputType id=\"cdt.managedbuild.tool.gnu.assembler.input.1053948313\" superClass=\"cdt.managedbuild.tool.gnu.assembler.input\"/>\n\t\t\t\t\t\t\t</tool>\n\t\t\t\t\t\t</toolChain>\n\t\t\t\t\t</folderInfo>\n\t\t\t\t\t<sourceEntries>\n\t\t\t\t\t\t<entry 
excluding=\"apps/exePhase4.cpp|apps/exePhase1_2.cpp\" flags=\"VALUE_WORKSPACE_PATH|RESOLVED\" kind=\"sourcePath\" name=\"\"/>\n\t\t\t\t\t</sourceEntries>\n\t\t\t\t</configuration>\n\t\t\t</storageModule>\n\t\t\t<storageModule moduleId=\"org.eclipse.cdt.core.externalSettings\"/>\n\t\t</cconfiguration>\n\t\t<cconfiguration id=\"cdt.managedbuild.config.gnu.exe.release.866313611\">\n\t\t\t<storageModule buildSystemId=\"org.eclipse.cdt.managedbuilder.core.configurationDataProvider\" id=\"cdt.managedbuild.config.gnu.exe.release.866313611\" moduleId=\"org.eclipse.cdt.core.settings\" name=\"Release\">\n\t\t\t\t<externalSettings/>\n\t\t\t\t<extensions>\n\t\t\t\t\t<extension id=\"org.eclipse.cdt.core.ELF\" point=\"org.eclipse.cdt.core.BinaryParser\"/>\n\t\t\t\t\t<extension id=\"org.eclipse.cdt.core.GASErrorParser\" point=\"org.eclipse.cdt.core.ErrorParser\"/>\n\t\t\t\t\t<extension id=\"org.eclipse.cdt.core.GmakeErrorParser\" point=\"org.eclipse.cdt.core.ErrorParser\"/>\n\t\t\t\t\t<extension id=\"org.eclipse.cdt.core.GLDErrorParser\" point=\"org.eclipse.cdt.core.ErrorParser\"/>\n\t\t\t\t\t<extension id=\"org.eclipse.cdt.core.CWDLocator\" point=\"org.eclipse.cdt.core.ErrorParser\"/>\n\t\t\t\t\t<extension id=\"org.eclipse.cdt.core.GCCErrorParser\" point=\"org.eclipse.cdt.core.ErrorParser\"/>\n\t\t\t\t</extensions>\n\t\t\t</storageModule>\n\t\t\t<storageModule moduleId=\"cdtBuildSystem\" version=\"4.0.0\">\n\t\t\t\t<configuration artifactName=\"${ProjName}\" buildArtefactType=\"org.eclipse.cdt.build.core.buildArtefactType.exe\" buildProperties=\"org.eclipse.cdt.build.core.buildArtefactType=org.eclipse.cdt.build.core.buildArtefactType.exe,org.eclipse.cdt.build.core.buildType=org.eclipse.cdt.build.core.buildType.release\" cleanCommand=\"rm -rf\" description=\"\" id=\"cdt.managedbuild.config.gnu.exe.release.866313611\" name=\"Release\" parent=\"cdt.managedbuild.config.gnu.exe.release\">\n\t\t\t\t\t<folderInfo id=\"cdt.managedbuild.config.gnu.exe.release.866313611.\" name=\"/\" 
resourcePath=\"\">\n\t\t\t\t\t\t<toolChain id=\"cdt.managedbuild.toolchain.gnu.exe.release.1253098200\" name=\"Linux GCC\" superClass=\"cdt.managedbuild.toolchain.gnu.exe.release\">\n\t\t\t\t\t\t\t<targetPlatform id=\"cdt.managedbuild.target.gnu.platform.exe.release.79855836\" name=\"Debug Platform\" superClass=\"cdt.managedbuild.target.gnu.platform.exe.release\"/>\n\t\t\t\t\t\t\t<builder buildPath=\"${workspace_loc:/ibrahim16-deep-act-rec-full}/Release\" enableAutoBuild=\"true\" id=\"cdt.managedbuild.target.gnu.builder.exe.release.1235806736\" keepEnvironmentInBuildfile=\"false\" managedBuildOn=\"true\" name=\"Gnu Make Builder\" parallelBuildOn=\"true\" parallelizationNumber=\"4\" superClass=\"cdt.managedbuild.target.gnu.builder.exe.release\"/>\n\t\t\t\t\t\t\t<tool id=\"cdt.managedbuild.tool.gnu.archiver.base.1764999849\" name=\"GCC Archiver\" superClass=\"cdt.managedbuild.tool.gnu.archiver.base\"/>\n\t\t\t\t\t\t\t<tool id=\"cdt.managedbuild.tool.gnu.cpp.compiler.exe.release.1469699233\" name=\"GCC C++ Compiler\" superClass=\"cdt.managedbuild.tool.gnu.cpp.compiler.exe.release\">\n\t\t\t\t\t\t\t\t<option id=\"gnu.cpp.compiler.exe.release.option.optimization.level.114010253\" name=\"Optimization Level\" superClass=\"gnu.cpp.compiler.exe.release.option.optimization.level\" value=\"gnu.cpp.compiler.optimization.level.most\" valueType=\"enumerated\"/>\n\t\t\t\t\t\t\t\t<option id=\"gnu.cpp.compiler.exe.release.option.debugging.level.1590442454\" name=\"Debug Level\" superClass=\"gnu.cpp.compiler.exe.release.option.debugging.level\" value=\"gnu.cpp.compiler.debugging.level.none\" valueType=\"enumerated\"/>\n\t\t\t\t\t\t\t\t<option id=\"gnu.cpp.compiler.option.other.other.1802321637\" name=\"Other flags\" superClass=\"gnu.cpp.compiler.option.other.other\" value=\"-c -fmessage-length=0 -std=c++0x\" valueType=\"string\"/>\n\t\t\t\t\t\t\t\t<option id=\"gnu.cpp.compiler.option.include.paths.443797051\" name=\"Include paths (-I)\" 
superClass=\"gnu.cpp.compiler.option.include.paths\" valueType=\"includePath\">\n\t\t\t\t\t\t\t\t\t<listOptionValue builtIn=\"false\" value=\"/rcg/software/Linux/RHEL/6/x86_64/LIB/OPENCV/3.0.0-CUDA65/include\"/>\n\t\t\t\t\t\t\t\t\t<listOptionValue builtIn=\"false\" value=\"/usr/include/openblas\"/>\n\t\t\t\t\t\t\t\t\t<listOptionValue builtIn=\"false\" value=\"/cs/vml2/msibrahi/workspaces/caffe-lstm/include\"/>\n\t\t\t\t\t\t\t\t\t<listOptionValue builtIn=\"false\" value=\"/cs/vml2/msibrahi/workspaces/caffe-lstm/build/src\"/>\n\t\t\t\t\t\t\t\t\t<listOptionValue builtIn=\"false\" value=\"/cs/vml2/msibrahi/workspaces/software/dlib\"/>\n\t\t\t\t\t\t\t\t\t<listOptionValue builtIn=\"false\" value=\"/usr/local/cuda-6.5/include\"/>\n\t\t\t\t\t\t\t\t\t<listOptionValue builtIn=\"false\" value=\"/rcg/software/Linux/RHEL/6/x86_64/LIB/BOOST/1.57.0/include\"/>\n\t\t\t\t\t\t\t\t\t<listOptionValue builtIn=\"false\" value=\"/rcg/software/Linux/RHEL/6/x86_64/LIB/GLOG/0.3.3/include\"/>\n\t\t\t\t\t\t\t\t\t<listOptionValue builtIn=\"false\" value=\"/rcg/software/Linux/RHEL/6/x86_64/LANG/PYTHON/2.7.6-SYSTEM/include/python2.7\"/>\n\t\t\t\t\t\t\t\t</option>\n\t\t\t\t\t\t\t\t<inputType id=\"cdt.managedbuild.tool.gnu.cpp.compiler.input.1753181949\" superClass=\"cdt.managedbuild.tool.gnu.cpp.compiler.input\"/>\n\t\t\t\t\t\t\t</tool>\n\t\t\t\t\t\t\t<tool id=\"cdt.managedbuild.tool.gnu.c.compiler.exe.release.1612447599\" name=\"GCC C Compiler\" superClass=\"cdt.managedbuild.tool.gnu.c.compiler.exe.release\">\n\t\t\t\t\t\t\t\t<option defaultValue=\"gnu.c.optimization.level.most\" id=\"gnu.c.compiler.exe.release.option.optimization.level.768981263\" name=\"Optimization Level\" superClass=\"gnu.c.compiler.exe.release.option.optimization.level\" valueType=\"enumerated\"/>\n\t\t\t\t\t\t\t\t<option id=\"gnu.c.compiler.exe.release.option.debugging.level.696644383\" name=\"Debug Level\" superClass=\"gnu.c.compiler.exe.release.option.debugging.level\" value=\"gnu.c.debugging.level.none\" 
valueType=\"enumerated\"/>\n\t\t\t\t\t\t\t\t<inputType id=\"cdt.managedbuild.tool.gnu.c.compiler.input.627752218\" superClass=\"cdt.managedbuild.tool.gnu.c.compiler.input\"/>\n\t\t\t\t\t\t\t</tool>\n\t\t\t\t\t\t\t<tool id=\"cdt.managedbuild.tool.gnu.c.linker.exe.release.749598668\" name=\"GCC C Linker\" superClass=\"cdt.managedbuild.tool.gnu.c.linker.exe.release\"/>\n\t\t\t\t\t\t\t<tool id=\"cdt.managedbuild.tool.gnu.cpp.linker.exe.release.8951506\" name=\"GCC C++ Linker\" superClass=\"cdt.managedbuild.tool.gnu.cpp.linker.exe.release\">\n\t\t\t\t\t\t\t\t<option id=\"gnu.cpp.link.option.libs.581056802\" name=\"Libraries (-l)\" superClass=\"gnu.cpp.link.option.libs\" valueType=\"libs\">\n\t\t\t\t\t\t\t\t\t<listOptionValue builtIn=\"false\" value=\"protobuf\"/>\n\t\t\t\t\t\t\t\t\t<listOptionValue builtIn=\"false\" value=\"opencv_core\"/>\n\t\t\t\t\t\t\t\t\t<listOptionValue builtIn=\"false\" value=\"opencv_highgui\"/>\n\t\t\t\t\t\t\t\t\t<listOptionValue builtIn=\"false\" value=\"opencv_imgproc\"/>\n\t\t\t\t\t\t\t\t\t<listOptionValue builtIn=\"false\" value=\"opencv_ml\"/>\n\t\t\t\t\t\t\t\t\t<listOptionValue builtIn=\"false\" value=\"gflags\"/>\n\t\t\t\t\t\t\t\t\t<listOptionValue builtIn=\"false\" value=\"leveldb\"/>\n\t\t\t\t\t\t\t\t\t<listOptionValue builtIn=\"false\" value=\"glog\"/>\n\t\t\t\t\t\t\t\t\t<listOptionValue builtIn=\"false\" value=\"boost_system\"/>\n\t\t\t\t\t\t\t\t\t<listOptionValue builtIn=\"false\" value=\"boost_filesystem\"/>\n\t\t\t\t\t\t\t\t\t<listOptionValue builtIn=\"false\" value=\"boost_chrono\"/>\n\t\t\t\t\t\t\t\t\t<listOptionValue builtIn=\"false\" value=\"boost_python\"/>\n\t\t\t\t\t\t\t\t\t<listOptionValue builtIn=\"false\" value=\"python2.7\"/>\n\t\t\t\t\t\t\t\t\t<listOptionValue builtIn=\"false\" value=\"caffe\"/>\n\t\t\t\t\t\t\t\t</option>\n\t\t\t\t\t\t\t\t<option id=\"gnu.cpp.link.option.paths.229833833\" name=\"Library search path (-L)\" superClass=\"gnu.cpp.link.option.paths\" 
valueType=\"libPaths\">\n\t\t\t\t\t\t\t\t\t<listOptionValue builtIn=\"false\" value=\"/rcg/software/Linux/RHEL/6/x86_64/LIB/OPENCV/3.0.0-CUDA65/lib\"/>\n\t\t\t\t\t\t\t\t\t<listOptionValue builtIn=\"false\" value=\"/cs/vml2/msibrahi/workspaces/caffe-lstm/build/lib\"/>\n\t\t\t\t\t\t\t\t\t<listOptionValue builtIn=\"false\" value=\"/usr/local/lib\"/>\n\t\t\t\t\t\t\t\t\t<listOptionValue builtIn=\"false\" value=\"/usr/lib\"/>\n\t\t\t\t\t\t\t\t\t<listOptionValue builtIn=\"false\" value=\"/usr/local/cuda-6.5/lib64\"/>\n\t\t\t\t\t\t\t\t\t<listOptionValue builtIn=\"false\" value=\"/rcg/software/Linux/RHEL/6/x86_64/LANG/PYTHON/2.7.6-SYSTEM/lib\"/>\n\t\t\t\t\t\t\t\t\t<listOptionValue builtIn=\"false\" value=\"/rcg/software/Linux/RHEL/6/x86_64/LIB/GLOG/0.3.3/lib\"/>\n\t\t\t\t\t\t\t\t\t<listOptionValue builtIn=\"false\" value=\"/rcg/software/Linux/RHEL/6/x86_64/LIB/BOOST/1.57.0/lib\"/>\n\t\t\t\t\t\t\t\t</option>\n\t\t\t\t\t\t\t\t<option id=\"gnu.cpp.link.option.userobjs.620271518\" name=\"Other objects\" superClass=\"gnu.cpp.link.option.userobjs\" valueType=\"userObjs\">\n\t\t\t\t\t\t\t\t\t<listOptionValue builtIn=\"false\" value=\"/cs/vml2/msibrahi/workspaces/software/dlib/examples/build/dlib_build/libdlib.a\"/>\n\t\t\t\t\t\t\t\t</option>\n\t\t\t\t\t\t\t\t<inputType id=\"cdt.managedbuild.tool.gnu.cpp.linker.input.38237156\" superClass=\"cdt.managedbuild.tool.gnu.cpp.linker.input\">\n\t\t\t\t\t\t\t\t\t<additionalInput kind=\"additionalinputdependency\" paths=\"$(USER_OBJS)\"/>\n\t\t\t\t\t\t\t\t\t<additionalInput kind=\"additionalinput\" paths=\"$(LIBS)\"/>\n\t\t\t\t\t\t\t\t</inputType>\n\t\t\t\t\t\t\t</tool>\n\t\t\t\t\t\t\t<tool id=\"cdt.managedbuild.tool.gnu.assembler.exe.release.1301567985\" name=\"GCC Assembler\" superClass=\"cdt.managedbuild.tool.gnu.assembler.exe.release\">\n\t\t\t\t\t\t\t\t<inputType id=\"cdt.managedbuild.tool.gnu.assembler.input.1098904455\" 
superClass=\"cdt.managedbuild.tool.gnu.assembler.input\"/>\n\t\t\t\t\t\t\t</tool>\n\t\t\t\t\t\t</toolChain>\n\t\t\t\t\t</folderInfo>\n\t\t\t\t\t<sourceEntries>\n\t\t\t\t\t\t<entry excluding=\"apps/exePhase4.cpp|apps/exePhase1_2.cpp\" flags=\"VALUE_WORKSPACE_PATH|RESOLVED\" kind=\"sourcePath\" name=\"\"/>\n\t\t\t\t\t</sourceEntries>\n\t\t\t\t</configuration>\n\t\t\t</storageModule>\n\t\t\t<storageModule moduleId=\"org.eclipse.cdt.core.externalSettings\"/>\n\t\t</cconfiguration>\n\t</storageModule>\n\t<storageModule moduleId=\"cdtBuildSystem\" version=\"4.0.0\">\n\t\t<project id=\"ibrahim16-deep-act-rec-full.cdt.managedbuild.target.gnu.exe.1172078451\" name=\"Executable\" projectType=\"cdt.managedbuild.target.gnu.exe\"/>\n\t</storageModule>\n\t<storageModule moduleId=\"scannerConfiguration\">\n\t\t<autodiscovery enabled=\"true\" problemReportingEnabled=\"true\" selectedProfileId=\"\"/>\n\t\t<scannerConfigBuildInfo instanceId=\"cdt.managedbuild.config.gnu.exe.release.866313611;cdt.managedbuild.config.gnu.exe.release.866313611.;cdt.managedbuild.tool.gnu.c.compiler.exe.release.1612447599;cdt.managedbuild.tool.gnu.c.compiler.input.627752218\">\n\t\t\t<autodiscovery enabled=\"true\" problemReportingEnabled=\"true\" selectedProfileId=\"\"/>\n\t\t</scannerConfigBuildInfo>\n\t\t<scannerConfigBuildInfo instanceId=\"cdt.managedbuild.config.gnu.exe.release.866313611;cdt.managedbuild.config.gnu.exe.release.866313611.;cdt.managedbuild.tool.gnu.cpp.compiler.exe.release.1469699233;cdt.managedbuild.tool.gnu.cpp.compiler.input.1753181949\">\n\t\t\t<autodiscovery enabled=\"true\" problemReportingEnabled=\"true\" selectedProfileId=\"\"/>\n\t\t</scannerConfigBuildInfo>\n\t\t<scannerConfigBuildInfo instanceId=\"cdt.managedbuild.config.gnu.exe.debug.1594868632;cdt.managedbuild.config.gnu.exe.debug.1594868632.;cdt.managedbuild.tool.gnu.c.compiler.exe.debug.1589680847;cdt.managedbuild.tool.gnu.c.compiler.input.935152630\">\n\t\t\t<autodiscovery enabled=\"true\" 
problemReportingEnabled=\"true\" selectedProfileId=\"\"/>\n\t\t</scannerConfigBuildInfo>\n\t\t<scannerConfigBuildInfo instanceId=\"cdt.managedbuild.config.gnu.exe.debug.1594868632;cdt.managedbuild.config.gnu.exe.debug.1594868632.;cdt.managedbuild.tool.gnu.cpp.compiler.exe.debug.413146304;cdt.managedbuild.tool.gnu.cpp.compiler.input.157465269\">\n\t\t\t<autodiscovery enabled=\"true\" problemReportingEnabled=\"true\" selectedProfileId=\"\"/>\n\t\t</scannerConfigBuildInfo>\n\t</storageModule>\n\t<storageModule moduleId=\"org.eclipse.cdt.core.LanguageSettingsProviders\"/>\n\t<storageModule moduleId=\"refreshScope\" versionNumber=\"2\">\n\t\t<configuration configurationName=\"Debug\">\n\t\t\t<resource resourceType=\"PROJECT\" workspacePath=\"/ibrahim16-deep-act-rec-full\"/>\n\t\t</configuration>\n\t\t<configuration configurationName=\"Release\"/>\n\t</storageModule>\n</cproject>\n"
  },
  {
    "path": "eclipse-project/ibrahim16-deep-act-rec-part/.project",
    "content": "<?xml version=\"1.0\" encoding=\"UTF-8\"?>\n<projectDescription>\n\t<name>ibrahim16-deep-act-rec-part</name>\n\t<comment></comment>\n\t<projects>\n\t</projects>\n\t<buildSpec>\n\t\t<buildCommand>\n\t\t\t<name>org.eclipse.cdt.managedbuilder.core.genmakebuilder</name>\n\t\t\t<arguments>\n\t\t\t</arguments>\n\t\t</buildCommand>\n\t\t<buildCommand>\n\t\t\t<name>org.eclipse.cdt.managedbuilder.core.ScannerConfigBuilder</name>\n\t\t\t<triggers>full,incremental,</triggers>\n\t\t\t<arguments>\n\t\t\t</arguments>\n\t\t</buildCommand>\n\t</buildSpec>\n\t<natures>\n\t\t<nature>org.eclipse.cdt.core.cnature</nature>\n\t\t<nature>org.eclipse.cdt.core.ccnature</nature>\n\t\t<nature>org.eclipse.cdt.managedbuilder.core.managedBuildNature</nature>\n\t\t<nature>org.eclipse.cdt.managedbuilder.core.ScannerConfigNature</nature>\n\t</natures>\n</projectDescription>\n"
  },
  {
    "path": "eclipse-project/ibrahim16-deep-act-rec-part/Debug/apps/subdir.mk",
    "content": "################################################################################\n# Automatically-generated file. Do not edit!\n################################################################################\n\n# Add inputs and outputs from these tool invocations to the build variables \nCPP_SRCS += \\\n../apps/exePhase3.cpp \n\nOBJS += \\\n./apps/exePhase3.o \n\nCPP_DEPS += \\\n./apps/exePhase3.d \n\n\n# Each subdirectory must supply rules for building sources it contributes\napps/%.o: ../apps/%.cpp\n\t@echo 'Building file: $<'\n\t@echo 'Invoking: GCC C++ Compiler'\n\tg++ -I/rcg/software/Linux/RHEL/6/x86_64/LIB/OPENCV/3.0.0-CUDA65/include -I/usr/include/openblas -I/cs/vml2/msibrahi/workspaces/caffe-lstm/include -I/cs/vml2/msibrahi/workspaces/caffe-lstm/build/src -I/cs/vml2/msibrahi/workspaces/software/dlib -I/usr/local/cuda-6.5/include -I/rcg/software/Linux/RHEL/6/x86_64/LIB/BOOST/1.57.0/include -I/rcg/software/Linux/RHEL/6/x86_64/LIB/GLOG/0.3.3/include -I/rcg/software/Linux/RHEL/6/x86_64/LANG/PYTHON/2.7.6-SYSTEM/include/python2.7 -O0 -g3 -Wall -c -fmessage-length=0 -std=c++0x -MMD -MP -MF\"$(@:%.o=%.d)\" -MT\"$(@:%.o=%.d)\" -o \"$@\" \"$<\"\n\t@echo 'Finished building: $<'\n\t@echo ' '\n\n\n"
  },
  {
    "path": "eclipse-project/ibrahim16-deep-act-rec-part/Debug/makefile",
    "content": "################################################################################\n# Automatically-generated file. Do not edit!\n################################################################################\n\n-include ../makefile.init\n\nRM := rm -rf\n\n# All of the sources participating in the build are defined here\n-include sources.mk\n-include src/subdir.mk\n-include apps/subdir.mk\n-include subdir.mk\n-include objects.mk\n\nifneq ($(MAKECMDGOALS),clean)\nifneq ($(strip $(CC_DEPS)),)\n-include $(CC_DEPS)\nendif\nifneq ($(strip $(C++_DEPS)),)\n-include $(C++_DEPS)\nendif\nifneq ($(strip $(C_UPPER_DEPS)),)\n-include $(C_UPPER_DEPS)\nendif\nifneq ($(strip $(CXX_DEPS)),)\n-include $(CXX_DEPS)\nendif\nifneq ($(strip $(CPP_DEPS)),)\n-include $(CPP_DEPS)\nendif\nifneq ($(strip $(C_DEPS)),)\n-include $(C_DEPS)\nendif\nendif\n\n-include ../makefile.defs\n\n# Add inputs and outputs from these tool invocations to the build variables \n\n# All Target\nall: ibrahim16-deep-act-rec-part\n\n# Tool invocations\nibrahim16-deep-act-rec-part: $(OBJS) $(USER_OBJS)\n\t@echo 'Building target: $@'\n\t@echo 'Invoking: GCC C++ Linker'\n\tg++ -L/rcg/software/Linux/RHEL/6/x86_64/LIB/OPENCV/3.0.0-CUDA65/lib -L/cs/vml2/msibrahi/workspaces/caffe-lstm/build/lib -L/usr/local/lib -L/usr/lib -L/usr/local/cuda-6.5/lib64 -L/rcg/software/Linux/RHEL/6/x86_64/LANG/PYTHON/2.7.6-SYSTEM/lib -L/rcg/software/Linux/RHEL/6/x86_64/LIB/GLOG/0.3.3/lib -L/rcg/software/Linux/RHEL/6/x86_64/LIB/BOOST/1.57.0/lib -o \"ibrahim16-deep-act-rec-part\" $(OBJS) $(USER_OBJS) $(LIBS)\n\t@echo 'Finished building target: $@'\n\t@echo ' '\n\n# Other Targets\nclean:\n\t-$(RM) $(CC_DEPS)$(C++_DEPS)$(EXECUTABLES)$(C_UPPER_DEPS)$(CXX_DEPS)$(OBJS)$(CPP_DEPS)$(C_DEPS) ibrahim16-deep-act-rec-part\n\t-@echo ' '\n\n.PHONY: all clean dependents\n.SECONDARY:\n\n-include ../makefile.targets\n"
  },
  {
    "path": "eclipse-project/ibrahim16-deep-act-rec-part/Debug/objects.mk",
    "content": "################################################################################\n# Automatically-generated file. Do not edit!\n################################################################################\n\nUSER_OBJS := /cs/vml2/msibrahi/workspaces/software/dlib/examples/build/dlib_build/libdlib.a\n\nLIBS := -lprotobuf -lopencv_core -lopencv_highgui -lopencv_imgproc -lopencv_ml -lgflags -lleveldb -lglog -lboost_system -lboost_filesystem -lboost_chrono -lboost_python -lpython2.7 -lcaffe\n\n"
  },
  {
    "path": "eclipse-project/ibrahim16-deep-act-rec-part/Debug/sources.mk",
    "content": "################################################################################\n# Automatically-generated file. Do not edit!\n################################################################################\n\nC_UPPER_SRCS := \nCXX_SRCS := \nC++_SRCS := \nOBJ_SRCS := \nCC_SRCS := \nASM_SRCS := \nCPP_SRCS := \nC_SRCS := \nO_SRCS := \nS_UPPER_SRCS := \nCC_DEPS := \nC++_DEPS := \nEXECUTABLES := \nC_UPPER_DEPS := \nCXX_DEPS := \nOBJS := \nCPP_DEPS := \nC_DEPS := \n\n# Every subdirectory with source files must be described here\nSUBDIRS := \\\nsrc \\\napps \\\n\n"
  },
  {
    "path": "eclipse-project/ibrahim16-deep-act-rec-part/Debug/src/subdir.mk",
    "content": "################################################################################\n# Automatically-generated file. Do not edit!\n################################################################################\n\n# Add inputs and outputs from these tool invocations to the build variables \nCPP_SRCS += \\\n../src/dlib-tracker-wrapper.cpp \\\n../src/images-utilities.cpp \\\n../src/leveldb-reader.cpp \\\n../src/leveldb-writer.cpp \\\n../src/rect-helper.cpp \\\n../src/utilities.cpp \\\n../src/volleyball-dataset-mgr.cpp \n\nOBJS += \\\n./src/dlib-tracker-wrapper.o \\\n./src/images-utilities.o \\\n./src/leveldb-reader.o \\\n./src/leveldb-writer.o \\\n./src/rect-helper.o \\\n./src/utilities.o \\\n./src/volleyball-dataset-mgr.o \n\nCPP_DEPS += \\\n./src/dlib-tracker-wrapper.d \\\n./src/images-utilities.d \\\n./src/leveldb-reader.d \\\n./src/leveldb-writer.d \\\n./src/rect-helper.d \\\n./src/utilities.d \\\n./src/volleyball-dataset-mgr.d \n\n\n# Each subdirectory must supply rules for building sources it contributes\nsrc/%.o: ../src/%.cpp\n\t@echo 'Building file: $<'\n\t@echo 'Invoking: GCC C++ Compiler'\n\tg++ -I/rcg/software/Linux/RHEL/6/x86_64/LIB/OPENCV/3.0.0-CUDA65/include -I/usr/include/openblas -I/cs/vml2/msibrahi/workspaces/caffe-lstm/include -I/cs/vml2/msibrahi/workspaces/caffe-lstm/build/src -I/cs/vml2/msibrahi/workspaces/software/dlib -I/usr/local/cuda-6.5/include -I/rcg/software/Linux/RHEL/6/x86_64/LIB/BOOST/1.57.0/include -I/rcg/software/Linux/RHEL/6/x86_64/LIB/GLOG/0.3.3/include -I/rcg/software/Linux/RHEL/6/x86_64/LANG/PYTHON/2.7.6-SYSTEM/include/python2.7 -O0 -g3 -Wall -c -fmessage-length=0 -std=c++0x -MMD -MP -MF\"$(@:%.o=%.d)\" -MT\"$(@:%.o=%.d)\" -o \"$@\" \"$<\"\n\t@echo 'Finished building: $<'\n\t@echo ' '\n\n\n"
  },
  {
    "path": "eclipse-project/ibrahim16-deep-act-rec-part/Release/apps/subdir.mk",
    "content": "################################################################################\n# Automatically-generated file. Do not edit!\n################################################################################\n\n# Add inputs and outputs from these tool invocations to the build variables \nCPP_SRCS += \\\n../apps/exePhase3.cpp \n\nOBJS += \\\n./apps/exePhase3.o \n\nCPP_DEPS += \\\n./apps/exePhase3.d \n\n\n# Each subdirectory must supply rules for building sources it contributes\napps/%.o: ../apps/%.cpp\n\t@echo 'Building file: $<'\n\t@echo 'Invoking: GCC C++ Compiler'\n\tg++ -I/rcg/software/Linux/RHEL/6/x86_64/LIB/OPENCV/3.0.0-CUDA65/include -I/usr/include/openblas -I/cs/vml2/msibrahi/workspaces/caffe-lstm/include -I/cs/vml2/msibrahi/workspaces/caffe-lstm/build/src -I/cs/vml2/msibrahi/workspaces/software/dlib -I/usr/local/cuda-6.5/include -I/rcg/software/Linux/RHEL/6/x86_64/LIB/BOOST/1.57.0/include -I/rcg/software/Linux/RHEL/6/x86_64/LIB/GLOG/0.3.3/include -I/rcg/software/Linux/RHEL/6/x86_64/LANG/PYTHON/2.7.6-SYSTEM/include/python2.7 -O3 -Wall -c -fmessage-length=0 -std=c++0x -MMD -MP -MF\"$(@:%.o=%.d)\" -MT\"$(@:%.o=%.d)\" -o \"$@\" \"$<\"\n\t@echo 'Finished building: $<'\n\t@echo ' '\n\n\n"
  },
  {
    "path": "eclipse-project/ibrahim16-deep-act-rec-part/Release/makefile",
    "content": "################################################################################\n# Automatically-generated file. Do not edit!\n################################################################################\n\n-include ../makefile.init\n\nRM := rm -rf\n\n# All of the sources participating in the build are defined here\n-include sources.mk\n-include src/subdir.mk\n-include apps/subdir.mk\n-include subdir.mk\n-include objects.mk\n\nifneq ($(MAKECMDGOALS),clean)\nifneq ($(strip $(CC_DEPS)),)\n-include $(CC_DEPS)\nendif\nifneq ($(strip $(C++_DEPS)),)\n-include $(C++_DEPS)\nendif\nifneq ($(strip $(C_UPPER_DEPS)),)\n-include $(C_UPPER_DEPS)\nendif\nifneq ($(strip $(CXX_DEPS)),)\n-include $(CXX_DEPS)\nendif\nifneq ($(strip $(CPP_DEPS)),)\n-include $(CPP_DEPS)\nendif\nifneq ($(strip $(C_DEPS)),)\n-include $(C_DEPS)\nendif\nendif\n\n-include ../makefile.defs\n\n# Add inputs and outputs from these tool invocations to the build variables \n\n# All Target\nall: ibrahim16-deep-act-rec-part\n\n# Tool invocations\nibrahim16-deep-act-rec-part: $(OBJS) $(USER_OBJS)\n\t@echo 'Building target: $@'\n\t@echo 'Invoking: GCC C++ Linker'\n\tg++ -L/rcg/software/Linux/RHEL/6/x86_64/LIB/OPENCV/3.0.0-CUDA65/lib -L/cs/vml2/msibrahi/workspaces/caffe-lstm/build/lib -L/usr/local/lib -L/usr/lib -L/usr/local/cuda-6.5/lib64 -L/rcg/software/Linux/RHEL/6/x86_64/LANG/PYTHON/2.7.6-SYSTEM/lib -L/rcg/software/Linux/RHEL/6/x86_64/LIB/GLOG/0.3.3/lib -L/rcg/software/Linux/RHEL/6/x86_64/LIB/BOOST/1.57.0/lib -o \"ibrahim16-deep-act-rec-part\" $(OBJS) $(USER_OBJS) $(LIBS)\n\t@echo 'Finished building target: $@'\n\t@echo ' '\n\n# Other Targets\nclean:\n\t-$(RM) $(CC_DEPS)$(C++_DEPS)$(EXECUTABLES)$(C_UPPER_DEPS)$(CXX_DEPS)$(OBJS)$(CPP_DEPS)$(C_DEPS) ibrahim16-deep-act-rec-part\n\t-@echo ' '\n\n.PHONY: all clean dependents\n.SECONDARY:\n\n-include ../makefile.targets\n"
  },
  {
    "path": "eclipse-project/ibrahim16-deep-act-rec-part/Release/objects.mk",
    "content": "################################################################################\n# Automatically-generated file. Do not edit!\n################################################################################\n\nUSER_OBJS := /cs/vml2/msibrahi/workspaces/software/dlib/examples/build/dlib_build/libdlib.a\n\nLIBS := -lprotobuf -lopencv_core -lopencv_highgui -lopencv_imgproc -lopencv_ml -lgflags -lleveldb -lglog -lboost_system -lboost_filesystem -lboost_chrono -lboost_python -lpython2.7 -lcaffe\n\n"
  },
  {
    "path": "eclipse-project/ibrahim16-deep-act-rec-part/Release/sources.mk",
    "content": "################################################################################\n# Automatically-generated file. Do not edit!\n################################################################################\n\nC_UPPER_SRCS := \nCXX_SRCS := \nC++_SRCS := \nOBJ_SRCS := \nCC_SRCS := \nASM_SRCS := \nCPP_SRCS := \nC_SRCS := \nO_SRCS := \nS_UPPER_SRCS := \nCC_DEPS := \nC++_DEPS := \nEXECUTABLES := \nC_UPPER_DEPS := \nCXX_DEPS := \nOBJS := \nCPP_DEPS := \nC_DEPS := \n\n# Every subdirectory with source files must be described here\nSUBDIRS := \\\nsrc \\\napps \\\n\n"
  },
  {
    "path": "eclipse-project/ibrahim16-deep-act-rec-part/Release/src/subdir.mk",
    "content": "################################################################################\n# Automatically-generated file. Do not edit!\n################################################################################\n\n# Add inputs and outputs from these tool invocations to the build variables \nCPP_SRCS += \\\n../src/dlib-tracker-wrapper.cpp \\\n../src/images-utilities.cpp \\\n../src/leveldb-reader.cpp \\\n../src/leveldb-writer.cpp \\\n../src/rect-helper.cpp \\\n../src/utilities.cpp \\\n../src/volleyball-dataset-mgr.cpp \n\nOBJS += \\\n./src/dlib-tracker-wrapper.o \\\n./src/images-utilities.o \\\n./src/leveldb-reader.o \\\n./src/leveldb-writer.o \\\n./src/rect-helper.o \\\n./src/utilities.o \\\n./src/volleyball-dataset-mgr.o \n\nCPP_DEPS += \\\n./src/dlib-tracker-wrapper.d \\\n./src/images-utilities.d \\\n./src/leveldb-reader.d \\\n./src/leveldb-writer.d \\\n./src/rect-helper.d \\\n./src/utilities.d \\\n./src/volleyball-dataset-mgr.d \n\n\n# Each subdirectory must supply rules for building sources it contributes\nsrc/%.o: ../src/%.cpp\n\t@echo 'Building file: $<'\n\t@echo 'Invoking: GCC C++ Compiler'\n\tg++ -I/rcg/software/Linux/RHEL/6/x86_64/LIB/OPENCV/3.0.0-CUDA65/include -I/usr/include/openblas -I/cs/vml2/msibrahi/workspaces/caffe-lstm/include -I/cs/vml2/msibrahi/workspaces/caffe-lstm/build/src -I/cs/vml2/msibrahi/workspaces/software/dlib -I/usr/local/cuda-6.5/include -I/rcg/software/Linux/RHEL/6/x86_64/LIB/BOOST/1.57.0/include -I/rcg/software/Linux/RHEL/6/x86_64/LIB/GLOG/0.3.3/include -I/rcg/software/Linux/RHEL/6/x86_64/LANG/PYTHON/2.7.6-SYSTEM/include/python2.7 -O3 -Wall -c -fmessage-length=0 -std=c++0x -MMD -MP -MF\"$(@:%.o=%.d)\" -MT\"$(@:%.o=%.d)\" -o \"$@\" \"$<\"\n\t@echo 'Finished building: $<'\n\t@echo ' '\n\n\n"
  },
  {
    "path": "eclipse-project/ibrahim16-deep-act-rec-part/apps/exePhase1_2.cpp",
    "content": "/*\n * w-driver-volleyball-lstm-evaluator.cpp\n *\n *  Created on: Jul 13, 2015\n *      Author: msibrahi\n */\n\n#include <stdio.h>\n#include <stdlib.h>\n\n#include <iostream>\n#include <vector>\n#include <set>\n#include <map>\nusing std::vector;\nusing std::set;\nusing std::map;\nusing std::pair;\nusing std::endl;\nusing std::cout;\n\n#include \"../src/leveldb-writer.h\"\n#include \"../src/custom-macros.h\"\n#include \"../src/rect-helper.h\"\n#include \"../src/utilities.h\"\n#include \"../src/images-utilities.h\"\n#include \"../src/custom-images-macros.h\"\n#include \"../src/dlib-tracker-wrapper.h\"\n#include \"../src/volleyball-dataset-mgr.h\"\nusing MostCV::VolleyballPerson;\nusing MostCV::VolleyballVideoData;\nusing MostCV::VolleyballDatasetPart;\nusing MostCV::VolleyballDatasetMgr;\nusing MostCV::RectHelper;\n\n#include <boost/random/mersenne_twister.hpp>\n#include <boost/random/uniform_int.hpp>\n#include <boost/random/variate_generator.hpp>\n\nconst int resize_width = 256;\nconst int resize_height = 256;\nconst int num_channels = 3;\nconst int kPlayersCount = 12;\n\n/////////////////////////////////////////////////////////////////////////////////////////////////////////\n\nint main(int argc, char** argv) {\n  string program_name = MostCV::consumeStringParam(argc, argv);\n\n  cerr << \"Start: \" << program_name << endl;\n  // read program entry data\n  string dataset_videos_path = MostCV::consumeStringParam(argc, argv);\n  string config_path = MostCV::consumeStringParam(argc, argv);\n  string leveldb_output_path = MostCV::consumeStringParam(argc, argv);\n  int temporal_window = MostCV::consumeIntParam(argc, argv);\n  int step = MostCV::consumeIntParam(argc, argv);\n  int bIsPrepareLSTMData = MostCV::consumeIntParam(argc, argv); // otherwise fusion data\n\n  if (bIsPrepareLSTMData)\n    cerr << \"LSTM 1 preparation\" << endl;\n  else\n    cerr << \"Data Fusion for LSTM 2\" << endl;\n\n  assert(temporal_window > 0);\n  
MostCV::fixDir(config_path);\n  MostCV::fixDir(dataset_videos_path);\n  MostCV::fixDir(leveldb_output_path);\n\n  cerr << \"Loading the dataset...\" << endl;\n  VolleyballDatasetMgr mgr(config_path, dataset_videos_path);\n\n  cerr << \"Temporal window = \" << temporal_window << \" with step = \" << step << \"\\n\\n\";\n\n  vector<Ptr<MostCV::LeveldbWriter> > dbMgrs;\n  Mat blackRectImage = Mat::zeros(resize_width, resize_height, CV_8UC3);\n\n  /////////////////////////////////////////////////////////////////////////////////////////////////////////\n\n  // Create leveldb datasets\n  for (auto &dataset : mgr.dataset_division_) {\n    dataset.dataset_db_name_ = dataset.dataset_name_ + \"-leveldb\";\n    dataset.dataset_db_path_ = leveldb_output_path + dataset.dataset_db_name_;\n\n    MostCV::fixDir(dataset.dataset_db_path_);\n\n    cerr<<\"Creating a new dataset\\n\";\n    dbMgrs.push_back(new MostCV::LeveldbWriter(dataset.dataset_db_path_, resize_height, resize_width, num_channels, false));\n\n    if (bIsPrepareLSTMData)\n      dbMgrs.back()->setLabelsRange(mgr.total_persons_labels);\n    else\n      dbMgrs.back()->setLabelsRange(mgr.total_scene_labels);\n  }\n\n  /////////////////////////////////////////////////////////////////////////////////////////////////////////\n\n  int dataset_pos = 0;\n  boost::mt19937 generator(100);\n  boost::uniform_int<> uni_dist;\n  boost::variate_generator<boost::mt19937&, boost::uniform_int<> > rand_generator(generator, uni_dist);\n\n  for (auto dataset : mgr.dataset_division_) {\n\n    // Shuffle data before use\n    cerr << \"Extracting shuffled elements from \" << dataset.dataset_name_ << \" Data Set. 
Total videos = \" << dataset.videos_vec_.size() << \"\\n\";\n\n    Ptr<MostCV::LeveldbWriter> dbMgr = dbMgrs[dataset_pos++];\n    vector<pair<VolleyballVideoData, string> > database_shuffled;\n\n    for (auto video : dataset.videos_vec_) {\n      for (auto frame_id : video.annot_frame_id_vec_)\n        database_shuffled.push_back(std::make_pair(video, frame_id));\n    }\n\n    std::random_shuffle(database_shuffled.begin(), database_shuffled.end(), rand_generator);\n\n    if (bIsPrepareLSTMData) {\n      cerr << \"Total images for current data set is \" << database_shuffled.size() << \". Overall entries will be <= \"\n           << temporal_window * database_shuffled.size() * kPlayersCount << endl;\n    } else {\n      cerr << \"Total images for current data set is \" << database_shuffled.size() << \". Overall entries will be = \"\n           << temporal_window * database_shuffled.size() * kPlayersCount << endl;\n    }\n    /////////////////////////////////////////////////////////////////////////////////////////////////////////\n\n    for (auto database_entry : database_shuffled) {\n      auto video = database_entry.first;\n      string frame_id = database_entry.second;\n      int frame_label = video.annot_frame_id_to_activity_id_map_[frame_id];\n\n      // prepare tracking data\n      pair<vector<string>, vector<string> > images_paths_seq = video.GetTemporalWindowPaths(frame_id, temporal_window, step, false);\n\n      vector<Mat> imagesSequenceBefore, imagesSequenceAfter;\n      Mat img;\n\n      for (auto path : images_paths_seq.first)\n        imagesSequenceBefore.push_back(cv::imread(path));\n\n      for (auto path : images_paths_seq.second)\n        imagesSequenceAfter.push_back(cv::imread(path));\n\n      if (imagesSequenceAfter.size())\n        img = imagesSequenceAfter.back();\n      else\n        img = imagesSequenceBefore.back();\n\n      assert(!img.empty());\n\n      vector<VolleyballPerson> &persons = video.annot_frame_id_persons_map_[frame_id];\n      
vector<Mat> images;\n      vector<vector<Rect> > persons_tracklets;\n\n      for (auto person : persons) {\n        MostCV::DlibTrackerWrapper tracker(person.bbox_.r);\n        pair<vector<Mat>, vector<Rect> > tracklet = tracker.Process(imagesSequenceBefore, imagesSequenceAfter);\n\n        images = tracklet.first;\n        persons_tracklets.push_back(tracklet.second);\n      }\n\n      // generates temporal_window * kPlayersCount * frames\n      int seq_id = 0, person_pos = 0;\n\n      for (auto tracklet : persons_tracklets) {\n        int rect_pos = 0;\n\n        for (auto img : images) {\n          dbMgr->clearDatum();\n          //MostCV::ShowImage(img(tracklet[rect_pos]));\n          assert(dbMgr->addImageToDatum(img(tracklet[rect_pos]), num_channels));\n\n          if (bIsPrepareLSTMData)\n            dbMgr->setDatumLabel(persons[person_pos].action_id_);\n          else\n            dbMgr->setDatumLabel(frame_label);\n\n          dbMgr->addDatumToBatch(video.video_id_ + \"_\" + frame_id + \"_\" + MostCV::toIntStr(\"000\", seq_id++));\n          rect_pos++;\n        }\n        ++person_pos;\n      }\n\n      // for missing persons, add zero images\n      if (!bIsPrepareLSTMData) {\n        LP(j, kPlayersCount - persons_tracklets.size())\n        {\n          LP(k, temporal_window)\n          {\n            dbMgr->clearDatum();\n            assert(dbMgr->addImageToDatum(blackRectImage, num_channels));\n            dbMgr->setDatumLabel(frame_label);\n            dbMgr->addDatumToBatch(video.video_id_ + \"_\" + frame_id + \"_\" + MostCV::toIntStr(\"000\", seq_id++));\n          }\n        }\n      }\n    }\n    dbMgr->forceFinalize();\n  }\n\n  cerr << \"\\n\\nBye: \" << program_name << endl;\n\n  return 0;\n}\n\n"
  },
  {
    "path": "eclipse-project/ibrahim16-deep-act-rec-part/apps/exePhase3.cpp",
    "content": "#include <stdio.h>\n#include <string>\n#include <iostream>\n#include <vector>\n#include <set>\nusing std::vector;\nusing std::set;\nusing std::string;\nusing std::pair;\nusing std::endl;\nusing std::cout;\n\n#include \"boost/algorithm/string.hpp\"\n#include \"google/protobuf/text_format.h\"\n\n#include \"caffe/blob.hpp\"\n#include \"caffe/common.hpp\"\n#include \"caffe/net.hpp\"\n#include \"caffe/proto/caffe.pb.h\"\n#include \"caffe/util/db.hpp\"\n#include \"caffe/util/io.hpp\"\n#include \"caffe/vision_layers.hpp\"\n\nusing caffe::Blob;\nusing caffe::Caffe;\nusing caffe::Datum;\nusing caffe::Net;\nusing caffe::Layer;\nusing caffe::LayerParameter;\nusing caffe::DataParameter;\nusing caffe::NetParameter;\nusing boost::shared_ptr;\nnamespace db = caffe::db;\n\n#include \"../src/utilities.h\"\n#include \"../src/leveldb-reader.h\"\n#include \"../src/leveldb-writer.h\"\n\nenum fuse_style {\n  concatenate_players = 0,\n  max_pool_players_1 = 1,   //  all players in one vec of feature mid size\n  max_pool_players_2 = 2,\n  max_pool_players_4 = 3,   // divide the ground 4 blocks and max pool it. E.g. 
in 16 players, each 4 has max pool\n  avg_pool_players_1 = 4,\n  avg_pool_players_2 = 5,\n  avg_pool_players_4 = 6,\n  sum_pool_players_1 = 7,\n  sum_pool_players_2 = 8,\n  sum_pool_players_4 = 9\n};\n\nstring fuse_style_sz[] = { \"concatenate_players\", \"max_pool_players_1\", \"max_pool_players_2\", \"max_pool_players_4\", \"avg_pool_players_1\", \"avg_pool_players_2\",\n    \"avg_pool_players_4\", \"sum_pool_players_1\", \"sum_pool_players_2\", \"sum_pool_players_4\" };\n\nint target_fuse_style = concatenate_players;\nconst int kPlayersCount = 12;\n\nvoid RemoveLastBlock(vector<float> &input, int block_length) {\n  assert((int) input.size() >= block_length);\n\n  for (int i = 0; i < block_length; ++i)\n    input.pop_back();\n}\n\nvoid AddLastBlock(vector<float> &input, int block_length) {\n  for (int i = 0; i < block_length; ++i)\n    input.push_back(0);\n}\n\nvoid RemoveDummyVectors(vector<float> &input, int block_length) {\n  bool is_all_zeros = true;\n\n  while (is_all_zeros && (int) input.size() > block_length) {  // Leave at least 1 block\n    int last_idx = input.size() - 1;\n    for (int i = 0; is_all_zeros && i < block_length; ++i)  // check the whole trailing block, not just one element\n      is_all_zeros &= input[last_idx - i] == 0;\n\n    if (is_all_zeros)\n      RemoveLastBlock(input, block_length);\n  }\n}\n\n// target_blocks_cnt = 1 => merge all sub-vectors in 1 block\n// target_blocks_cnt = 4 => merge every set of consecutive sub-vectors to get total 4 blocks\nvector<float> VectorsFusing(vector<float> &input, int block_length, int target_blocks_cnt) {\n\n  // I fixed bug here...hopefully not big problem!\n\n  if (target_fuse_style == avg_pool_players_1 || target_fuse_style == sum_pool_players_1 || target_fuse_style == max_pool_players_1)\n    RemoveDummyVectors(input, block_length);\n  else if (target_fuse_style == concatenate_players) {\n    int cur_blocks = input.size() / block_length;\n\n    // then we need specific count of boxes\n    assert(cur_blocks >= kPlayersCount);\n\n    
while (cur_blocks > kPlayersCount) {\n      --cur_blocks;\n      RemoveLastBlock(input, block_length);\n    }\n  } else {\n    RemoveDummyVectors(input, block_length);\n\n    while (input.size() > 0 && (input.size() % (block_length * target_blocks_cnt) != 0))\n      AddLastBlock(input, block_length);\n  }\n\n  vector<float> output;\n  const float* pData = &input[0];\n  if (input.size() % (block_length * target_blocks_cnt) != 0) {\n    cerr << \"Error A%(B*C) != 0 => \" << input.size() << \" \" << block_length << \" \" << target_blocks_cnt << \"\\n\";\n    assert(input.size() % (block_length * target_blocks_cnt) == 0);\n  }\n  int merge_blocks_cnt = input.size() / (block_length * target_blocks_cnt);  // merge cnt\n\n  for (int i = 0; i < (int) input.size(); i += block_length * merge_blocks_cnt) {\n    int t = merge_blocks_cnt;\n\n    vector<float> sub_output(block_length);\n\n    for (int j = 0; j < block_length; ++j)\n      sub_output[j] = pData[j];\n\n    pData += block_length;\n    --t;\n\n    while (t--) {\n      for (int j = 0; j < block_length; ++j) {\n        if (target_fuse_style == avg_pool_players_1 || target_fuse_style == avg_pool_players_2 || target_fuse_style == avg_pool_players_4\n            || target_fuse_style == sum_pool_players_1|| target_fuse_style == sum_pool_players_2|| target_fuse_style == sum_pool_players_4)\n          sub_output[j] += pData[j];\n        else\n          sub_output[j] = std::max(sub_output[j], pData[j]);\n      }\n\n      pData += block_length;\n    }\n\n    for (auto val : sub_output)\n      output.push_back(val);\n  }\n\n  if (target_fuse_style == avg_pool_players_1 || target_fuse_style == avg_pool_players_2 || target_fuse_style == avg_pool_players_4) {\n    for (auto &val : output)\n      val /= merge_blocks_cnt;\n  }\n\n  return output;\n}\n\n\n\ntemplate<typename Dtype>\nvoid feature_extraction_pipeline(int &argc, char** &argv) {\n\n  target_fuse_style = MostCV::consumeIntParam(argc, argv, \"target_fuse_style\");\n  
LOG(ERROR)<< \"Fusing style = \"<<fuse_style_sz[target_fuse_style] <<\"\\n\\n\";\n\n  int frames_window = MostCV::consumeIntParam(argc, argv, \"frames_window\");\n  LOG(ERROR)<< \"frames_window = \" << frames_window;\n\n  LOG(ERROR)<< \"Expected batch size = \"<<kPlayersCount * frames_window;\n\n  string computation_mode = MostCV::consumeStringParam(argc, argv);\n\n  if (strcmp(computation_mode.c_str(), \"GPU\") == 0) {\n    uint device_id = MostCV::consumeIntParam(argc, argv, \"device_id\");\n\n    LOG(ERROR)<< \"Using GPU\";\n    LOG(ERROR)<< \"Using Device_id=\" << device_id;\n\n    Caffe::SetDevice(device_id);\n    Caffe::set_mode(Caffe::GPU);\n  } else {\n    LOG(ERROR)<< \"Using CPU\";\n    Caffe::set_mode(Caffe::CPU);\n  }\n\n  string pretrained_binary_proto(MostCV::consumeStringParam(argc, argv));\n  string feature_extraction_proto(MostCV::consumeStringParam(argc, argv));\n\n  LOG(ERROR)<<\"Model: \"<<pretrained_binary_proto;\n  LOG(ERROR)<<\"Proto: \"<<feature_extraction_proto;\n\n  LOG(ERROR)<<\"Creating the test network\\n\";\n  shared_ptr<Net<Dtype> > feature_extraction_net(new Net<Dtype>(feature_extraction_proto, caffe::Phase::TEST));\n\n  LOG(ERROR)<<\"Loading the Model\\n\";\n  feature_extraction_net->CopyTrainedLayersFrom(pretrained_binary_proto);\n\n  vector<string> blob_names_vec;\n\n  int blobs_cnt = MostCV::consumeIntParam(argc, argv, \"blobs_cnt\");\n\n  assert(blobs_cnt > 0);\n\n  LOG(ERROR)<<\"# of blobs is \"<<blobs_cnt;\n\n  LP(i, blobs_cnt)\n  {\n    string blob_name = MostCV::consumeStringParam(argc, argv);\n\n    LOG(ERROR)<<\"blob_name: \"<<blob_name;\n\n    CHECK(feature_extraction_net->has_blob(blob_name)) << \"Unknown feature blob name \" << blob_name << \" in the network \" << feature_extraction_proto;\n\n    blob_names_vec.push_back(blob_name);\n  }\n\n  string output_dataset_name = MostCV::consumeStringParam(argc, argv);\n\n  int num_mini_batches = MostCV::consumeIntParam(argc, argv, \"num_mini_batches\");\n\n  
LOG(ERROR)<<\"num_mini_batches: \"<<num_mini_batches;\n\n  MostCV::LeveldbWriter leveldbWriter(output_dataset_name);\n\n  Datum datum;\n  const int kMaxKeyStrLength = 100;\n  char key_str[kMaxKeyStrLength];\n  vector<Blob<float>*> input_vec;\n  int db_entry_idx = 0;\n  int batch_size = -1;\n  int dim_features = -1;\n\n  std::set<int> batch_labels;        // all our batch value must be same\n  std::set<int> dataset_labels;   // logically database shouldn't have only 1 label\n\n  for (int batch_index = 0; batch_index < num_mini_batches; ++batch_index) {  // e.g. 100 iterations. Probably roll on data if needed\n    feature_extraction_net->Forward(input_vec);  // Take one batch of data (e.g. 50 images), and pass them to end of network\n\n    // Load the Labels\n    const shared_ptr<Blob<Dtype> > label_blob = feature_extraction_net->blob_by_name(\"label\");\n\n    batch_size = label_blob->num();                   // e.g. 16 batches for volleyball..represents the boxes of a frame.\n\n    assert(batch_size == frames_window * kPlayersCount);\n\n    batch_labels.clear();\n    int current_label = -1;\n\n    for (int n = 0; n < batch_size; ++n) {\n      const Dtype* label_blob_data = label_blob->cpu_data() + label_blob->offset(n);  // move offset to ith blob in batch\n      current_label = label_blob_data[0];  // all will be same value\n      batch_labels.insert(current_label);\n      dataset_labels.insert(current_label);\n    }\n\n    if (batch_labels.size() != 1) {  // every 1 batch should have same value\n      cerr << \"\\n\\nERROR. Every 1 batch should have the same value. Inconsistent batch # \" << batch_index + 1 << \"-th\\n\";\n      cerr << \"Overall unique labels are: \" << batch_labels.size() << \". 
The appeared labels are: \";\n\n      for(auto label : batch_labels)\n        cerr<<label<<\" \";\n      cerr<<\"\\n\";\n      assert(false);\n    }\n    vector<shared_ptr<Blob<Dtype> > > feature_blob_vec;\n\n    for (auto blob_name : blob_names_vec) {\n      shared_ptr<Blob<Dtype> > feature_blob = feature_extraction_net->blob_by_name(blob_name);  // get e.g. fc7 blob for the batch\n      feature_blob_vec.push_back(feature_blob);\n    }\n\n    int total_dim_features = 0;\n    static bool print_once_feature_vec = true;\n\n    if (print_once_feature_vec)\n      LOG(ERROR)<<\"\\n\\n\";\n\n    for (auto feature_blob : feature_blob_vec) {\n      dim_features = feature_blob->count() / batch_size;  // e.g. 4096\n      total_dim_features += dim_features;                 // e.g. 4096 of fc7 + 250 of lstm1\n\n      if (print_once_feature_vec)\n        LOG(ERROR)<<\"ith Vector Length = \"<<dim_features;\n      }\n\n    vector<vector<float> > window_feature_vecs(frames_window);\n\n    for (int n = 0; n < batch_size; ++n) {\n      for (auto feature_blob : feature_blob_vec) {\n        dim_features = feature_blob->count() / batch_size;  // e.g. 
4096\n\n        const Dtype* feature_blob_data = feature_blob->cpu_data() + feature_blob->offset(n);  // move offset to ith blob in batch\n\n        int p = n % frames_window;\n        for (int d = 0; d < dim_features; ++d)\n          window_feature_vecs[p].push_back(feature_blob_data[d]);\n      }\n    }\n    for (auto &feature_vec : window_feature_vecs) {\n      if (target_fuse_style == max_pool_players_1 || target_fuse_style == avg_pool_players_1 || target_fuse_style == sum_pool_players_1)\n        feature_vec = VectorsFusing(feature_vec, total_dim_features, 1);\n      else if (target_fuse_style == max_pool_players_2 || target_fuse_style == avg_pool_players_2 || target_fuse_style == sum_pool_players_2)\n        feature_vec = VectorsFusing(feature_vec, total_dim_features, 2);\n      else if (target_fuse_style == max_pool_players_4 || target_fuse_style == avg_pool_players_4 || target_fuse_style == sum_pool_players_4)\n        feature_vec = VectorsFusing(feature_vec, total_dim_features, 4);\n      // otherwise, keep it concatenated\n\n      if (print_once_feature_vec)\n        LOG(ERROR)<<\"Fused Vector Length = \"<<feature_vec.size();\n\n      print_once_feature_vec = false;\n\n      datum.set_width(1);\n      datum.set_channels(1);\n      datum.clear_data();\n      datum.clear_float_data();\n      datum.set_height(feature_vec.size());\n\n      for (int p = 0; p < (int) feature_vec.size(); ++p)\n        datum.add_float_data(feature_vec[p]);\n\n      int length = snprintf(key_str, kMaxKeyStrLength, \"%010d\", db_entry_idx);  // \"%010d\" BUG fix\n      leveldbWriter.addDatumToBatch(datum, string(key_str, length), current_label);\n      ++db_entry_idx;\n      feature_vec.clear();\n    }\n  }\n  leveldbWriter.forceFinalize();\n\n  assert(dataset_labels.size() > 1);  // some variety make sense!\n}\n\n\n\n\n\n\n\n\nint main(int argc, char** argv) {\n  ::google::InitGoogleLogging(argv[0]);\n  MostCV::consumeStringParam(argc, argv);  // read program entry data\n\n  
LOG(ERROR)<< \"Make sure to have LD_LIBRARY_PATH pointing to LSTM implementation in case of LSTM\\n\\n\";\n\n  // loop as long as chunks of argument data remain\n  while (argc) {\n    if (argc < 6) {\n      LOG(ERROR)<< \"At least 6 parameters expected\\n\";\n      assert(false);\n    }\n\n    feature_extraction_pipeline<float>(argc, argv);\n    LOG(ERROR)<< \"\\n\\nSuccessfully extracted the features!\\n\\n\";\n  }\n\n  return 0;\n}\n"
  },
  {
    "path": "eclipse-project/ibrahim16-deep-act-rec-part/apps/exePhase4.cpp",
    "content": "/*\n * w-driver-volleyball-lstm-evaluator.cpp\n *\n *  Created on: Jul 13, 2015\n *      Author: msibrahi\n */\n\n#include <iostream>\n#include <vector>\n#include <stdio.h>\n#include <string>\n#include <set>\n#include <set>\n#include <map>\n#include <iomanip>\nusing std::vector;\nusing std::set;\nusing std::multiset;\nusing std::map;\nusing std::pair;\nusing std::string;\nusing std::endl;\nusing std::cerr;\n\n#include \"boost/algorithm/string.hpp\"\n#include \"google/protobuf/text_format.h\"\n\n#include \"caffe/blob.hpp\"\n#include \"caffe/common.hpp\"\n#include \"caffe/net.hpp\"\n#include \"caffe/proto/caffe.pb.h\"\n#include \"caffe/util/db.hpp\"\n#include \"caffe/util/io.hpp\"\n#include \"caffe/vision_layers.hpp\"\n\nusing caffe::Blob;\nusing caffe::Caffe;\nusing caffe::Datum;\nusing caffe::Net;\nusing caffe::Layer;\nusing caffe::LayerParameter;\nusing caffe::DataParameter;\nusing caffe::NetParameter;\nusing boost::shared_ptr;\nnamespace db = caffe::db;\n\n#include \"../src/utilities.h\"\n#include \"../src/leveldb-reader.h\"\n\nvoid evaluate(vector<int> truthLabels, vector<int> resultLabels, int w) {\n\n  set<int> total_labels;\n  map<int, map<int, int> > confusion_freq_maps;\n  map<int, int> label_freq;\n  int correct = 0;\n\n  cerr<<\"\\n\\n\";\n  for (int i = 0; i < (int) truthLabels.size(); ++i) {\n    correct += truthLabels[i] == resultLabels[i];\n\n    cerr << \"Test \" << i + 1 << \": Result = \" << resultLabels[i] << \" GroundTruth = \" << truthLabels[i] << \"\\n\";\n\n    confusion_freq_maps[truthLabels[i]][resultLabels[i]]++;\n    total_labels.insert(truthLabels[i]);\n    total_labels.insert(resultLabels[i]);\n    label_freq[truthLabels[i]]++;\n  }\n\n  cerr.setf(std::ios::fixed);\n  cerr.precision(2);\n\n  cerr<<\"\\n\\n\";\n  cerr << \"Total testing frames: \" << truthLabels.size() << \" with temporal window: \" << w << \"\\n\";\n  cerr << \"Temporal accuracy : \" << 100.0 * correct / truthLabels.size() << \" %\\n\";\n  cerr << 
\"\\n=======================================================================================\\n\";\n\n  cerr << \"\\nConfusion Matrix - Truth (col) / Result(row)\\n\\n\";\n\n  cerr << std::setw(5) << \"T/R\" << \": \";\n\n  for (auto r_label : total_labels)\n    cerr << std::setw(5) << r_label;\n  cerr << \"\\n=======================================================================================\\n\";\n\n  for (auto t_label : total_labels) {\n    int sum = 0;\n    cerr << std::setw(5) << t_label << \": \";\n\n    for (auto r_label : total_labels)\n    {\n      cerr << std::setw(5) << confusion_freq_maps[t_label][r_label];\n\n      sum += confusion_freq_maps[t_label][r_label];\n    }\n\n    double percent = 0;\n\n    if (label_freq[t_label] > 0)\n      percent = 100.0 * confusion_freq_maps[t_label][t_label] / label_freq[t_label];\n\n    cerr << \" \\t=> Total Correct = \" << std::setw(5) << confusion_freq_maps[t_label][t_label] << \" / \" << std::setw(5) << sum << \" = \" << percent << \" %\\n\";\n  }\n\n\n  cerr<<\"\\n\\n\";\n  cerr << std::setw(7) << \"T/R\" << \": \";\n\n  for (auto r_label : total_labels)\n    cerr << std::setw(7) << r_label;\n  cerr << \"\\n=======================================================================================\\n\";\n\n  for (auto t_label : total_labels) {\n    cerr << std::setw(7) << t_label << \": \";\n\n    for (auto r_label : total_labels)\n    {\n      double percent = 0;\n\n      if (label_freq[t_label] > 0)\n        percent = 100.0 * confusion_freq_maps[t_label][r_label] / label_freq[t_label];\n\n      cerr << std::setw(7) << percent;\n    }\n    cerr<<\"\\n\";\n  }\n\n  cerr<<\"\\nTo get labels corresponding to IDs..see dataset loading logs\\n\";\n}\n\nint getArgmax(vector<float> &v) {\n  int pos = 0;\n\n  assert(v.size() > 0);\n\n  for (int j = 1; j < (int) v.size(); ++j) {\n    if (v[j] > v[pos])\n      pos = j;\n  }\n  return pos;\n}\n\ntemplate<typename Dtype>\nvoid feature_extraction_pipeline(int &argc, char** 
&argv) {\n\n  int frames_window = MostCV::consumeIntParam(argc, argv);\n  LOG(ERROR)<< \"Temporal Window = \" << frames_window;\n\n  string computation_mode = MostCV::consumeStringParam(argc, argv);\n\n  if (strcmp(computation_mode.c_str(), \"GPU\") == 0) {\n    uint device_id = MostCV::consumeIntParam(argc, argv);\n\n    LOG(ERROR)<< \"Using GPU\";\n    LOG(ERROR)<< \"Using Device_id = \" << device_id;\n\n    Caffe::SetDevice(device_id);\n    Caffe::set_mode(Caffe::GPU);\n  } else {\n    LOG(ERROR)<< \"Using CPU\";\n    Caffe::set_mode(Caffe::CPU);\n  }\n\n  string pretrained_binary_proto(MostCV::consumeStringParam(argc, argv));\n  string feature_extraction_proto(MostCV::consumeStringParam(argc, argv));\n\n  LOG(ERROR)<<\"Model: \"<<pretrained_binary_proto<<\"\\n\";\n  LOG(ERROR)<<\"Proto: \"<<feature_extraction_proto<<\"\\n\";\n\n  LOG(ERROR)<<\"Creating the test network\\n\";\n  shared_ptr<Net<Dtype> > feature_extraction_net(new Net<Dtype>(feature_extraction_proto, caffe::Phase::TEST));\n\n  LOG(ERROR)<<\"Loading the Model\\n\";\n  feature_extraction_net->CopyTrainedLayersFrom(pretrained_binary_proto);\n\n  string blob_name = MostCV::consumeStringParam(argc, argv);\n  LOG(ERROR)<<\"blob_name: \"<<blob_name<<\"\\n\";\n\n  CHECK(feature_extraction_net->has_blob(blob_name)) << \"Unknown feature blob name \" << blob_name << \" in the network \" << feature_extraction_proto;\n\n  int num_mini_batches = MostCV::consumeIntParam(argc, argv);\n  LOG(ERROR)<<\"num_mini_batches: \"<<num_mini_batches<<\"\\n\";\n\n  vector<Blob<float>*> input_vec;\n  int batch_size = -1;\n  int dim_features = -1;\n  std::set<int> labels;       // every (2w+1) * batch size MUST all have same label\n\n  vector<int> truthLabels;\n  vector<int> propAvgMaxResultLabels;\n\n  for (int batch_index = 0; batch_index < num_mini_batches; ++batch_index) {  // e.g. 100 iterations. Probably roll on data if needed\n    feature_extraction_net->Forward(input_vec);  // Take one batch of data (e.g. 
50 images), and pass them to end of network\n\n    // Load the Labels\n    const shared_ptr<Blob<Dtype> > label_blob = feature_extraction_net->blob_by_name(\"label\");\n\n    batch_size = label_blob->num();                   // e.g. 50 batches\n\n    assert(batch_size == frames_window);\n\n    int current_label = -1;\n    for (int n = 0; n < batch_size; ++n) {\n      const Dtype* label_blob_data = label_blob->cpu_data() + label_blob->offset(n);  // move offset to ith blob in batch\n      current_label = label_blob_data[0];  // all will be same value\n      labels.insert(current_label);\n\n      if (n == 0)\n        truthLabels.push_back(current_label);\n    }\n\n    if (labels.size() != 1) {  // every 1 batch should have same value\n      LOG(ERROR)<< \"Something wrong. every 1 batch should have same value. New value at element \" << batch_index + 1 << \"\\n\";\n      assert(false);\n    }\n    labels.clear();\n\n    const shared_ptr<Blob<Dtype> > feature_blob = feature_extraction_net->blob_by_name(blob_name);  // get e.g. 
fc7 blob for the batch\n\n    dim_features = feature_blob->count() / batch_size;\n    assert(dim_features > 1);\n\n    const Dtype* feature_blob_data = nullptr;\n\n    vector<float> test_case_sum(dim_features);\n\n    for (int n = 0; n < batch_size; ++n) {\n      feature_blob_data = feature_blob->cpu_data() + feature_blob->offset(n);  // move offset to ith blob in batch\n\n      vector<float> test_case;\n      for (int j = 0; j < dim_features; ++j) {\n        test_case.push_back(feature_blob_data[j]);\n\n        test_case_sum[j] += feature_blob_data[j];\n      }\n    }\n\n    propAvgMaxResultLabels.push_back( getArgmax(test_case_sum) );\n  }\n\n  evaluate(truthLabels, propAvgMaxResultLabels, 1);\n}\n\nint main(int argc, char** argv) {\n  ::google::InitGoogleLogging(argv[0]);\n  MostCV::consumeStringParam(argc, argv);  // read program entry data\n\n  if (argc < 6) {\n    LOG(ERROR)<< \"At least 6 parameters expected\\n\";\n    assert(false);\n  }\n\n  LOG(ERROR)<< \"Make sure to have LD_LIBRARY_PATH pointing to LSTM implementation in case of LSTM\\n\\n\";\n\n  feature_extraction_pipeline<float>(argc, argv);\n\n  return 0;\n}\n"
  },
  {
    "path": "eclipse-project/ibrahim16-deep-act-rec-part/src/custom-abbreviation.h",
    "content": "/*\n * custom-abbreviation.h\n *\n *  Created on: 2015-06-08\n *      Author: Moustafa S. Ibrahim\n */\n\n#ifndef CUSTOM_ABBREVIATION_H_\n#define CUSTOM_ABBREVIATION_H_\n\n#include <cmath>\n#include <vector>\n#include <string>\n\nnamespace MostCV\n{\n  using std::vector;\n  using std::string;\n\n  typedef vector<int>       vi;\n  typedef vector<double>    vd;\n  typedef vector< vi >      vvi;\n  typedef vector< vd >      vvd;\n  typedef vector<string>    vs;\n  typedef long long         ll;\n  typedef long double       ld;\n  //typedef unsigned char   uchar;\n\n  const ll      OO = (ll)1e10;\n  const double    PI  = std::acos(-1.0);\n  const long double   EPS = (1e-15);\n\n  // 4 orthogonal directions, 4 diagonal directions and last is same position\n  //int DR11[9] = {1, 0, 0, -1, 1, 1, -1, -1, 0};\n  //int DC11[9] = {0, 1, -1, 0, -1, 1, -1, 1, 0};\n\n  enum DIRS_ENUM {Left, Right, Bottom, Top};\n}\n\n\n#endif /* CUSTOM_ABBREVIATION_H_ */\n"
  },
  {
    "path": "eclipse-project/ibrahim16-deep-act-rec-part/src/custom-images-macros.h",
    "content": "/*\n * custom-images-macros.h\n *\n *  Created on: 2015-05-21\n *      Author: Moustafa S. Ibrahim\n */\n\n#ifndef CUSTOM_IMAGES_MACROS_H_\n#define CUSTOM_IMAGES_MACROS_H_\n\nnamespace MostCV {\n\n#define REPIMG2(y, x, img)       for(int y=0;y<(int)(img.rows);++y) for(int x=0;x<(int)(img.cols);++x)\n#define REPIMG3(y, x, c, img)    for(int y=0;y<(int)(img.rows);++y) for(int x=0;x<(int)(img.cols);++x) for(int c=0;c<(int)(img.channels());++c)\n#define REPIMG_JUMP(y, x, dy, dx, img)       for(int y=0;y<(int)(img.rows);y+=dy) for(int x=0;x<(int)(img.cols);x+=dx)\n}\n\n#endif /* CUSTOM_IMAGES_MACROS_H_ */\n"
  },
  {
    "path": "eclipse-project/ibrahim16-deep-act-rec-part/src/custom-macros.h",
    "content": "/*\n * custom-macros.h\n *\n *  Created on: 2015-05-21\n *      Author: Moustafa S. Ibrahim\n */\n\n#ifndef CUSTOM_MACROS_H_\n#define CUSTOM_MACROS_H_\n\n#include <cstring>\n\nnamespace MostCV {\n\n#define ALL(v)        ((v).begin()), ((v).end())\n#define RALL(v)       ((v).rbegin()), ((v).rend())\n#define SZ(v)         ((int)((v).size()))\n#define CLR(v, d)     memset(v, d, sizeof(v))\n#define REP(i, v)     for(int i=0;i<SZ(v);++i)\n#define REPI(i, j, v)     for(int i=(j);i<SZ(v);++i)\n//#define REPIT(i, c) for(typeof((c).begin()) i = (c).begin(); i != (c).end(); i++)\n#define REPIT(i, c) for(auto i = (c).begin(); i != (c).end(); i++)\n#define LP(i, n)      for(int i=0;i<(int)(n);++i)\n#define LPI(i, j, n)  for(int i=(j);i<(int)(n);++i)\n#define LPD(i, j, n)    for(int i=(j);i>=(int)(n);--i)\n#define REPA(v)       LPI(i, 0, SZ(v)) LPI(j, 0, SZ(v[i]))\n\n// ToDo: http://www.quora.com/What-are-some-macros-that-are-used-in-programming-contests\n}\n\n#endif /* CUSTOM_MACROS_H_ */\n"
  },
  {
    "path": "eclipse-project/ibrahim16-deep-act-rec-part/src/dlib-tracker-wrapper.cpp",
    "content": "/*\n * dlib-tracker-wrapper.cpp\n *\n *  Created on: 2015-06-22\n *      Author: Moustafa S. Ibrahim\n */\n\n#include \"dlib-tracker-wrapper.h\"\n#include \"custom-images-macros.h\"\n\n#include <iostream>\nusing std::cerr;\n\nnamespace MostCV {\n\nDlibTrackerWrapper::DlibTrackerWrapper(Rect initial_location) {\n  initial_location_ = initial_location;\n  step_ = 0;\n}\n\nRect DlibTrackerWrapper::UpdateTracker(Mat img) {\n  Rect img_rect = Rect(0, 0, img.cols-1, img.rows-1);\n  cv::Mat gray_img;\n\n  if (CV_8U != img.type() || 1 != img.channels())\n    cv::cvtColor(img, gray_img, cv::COLOR_BGR2GRAY);\n  else\n    gray_img = img;\n\n  dlib::array2d<uchar> dlib_img(gray_img.rows, gray_img.cols);\n\n  REPIMG2(y, x, gray_img)\n      dlib_img[y][x] = gray_img.at<uchar> (y, x);\n\n  if (step_ == 0) {\n    initial_location_ &= img_rect;  // Fix first one in case\n\n    if(initial_location_.area() == 0)\n    {\n      cerr<<\"Dlib: Empty rectangle for tracking! Let's do workaround\\n\";\n\n      initial_location_ = Rect(0, 0, 1, 1);\n    }\n\n    tracker_.start_track(dlib_img, dlib::centered_rect(dlib::point(initial_location_.x + initial_location_.width / 2, initial_location_.y + initial_location_.height / 2),\n                                                 initial_location_.width, initial_location_.height));\n    ++step_;\n    return initial_location_;\n  }\n\n  tracker_.update(dlib_img);\n  int y1 = tracker_.get_position().top();\n  int x1 = tracker_.get_position().left();\n  int y2 = tracker_.get_position().bottom();\n  int x2 = tracker_.get_position().right();\n\n  ++step_;\n\n  Rect rect = Rect(x1, y1, x2-x1, y2-y1);\n\n\n  rect &= img_rect;\n\n  if(rect.area() < 1)   // zero areas usually cause problems. Let's give them 1 area box\n    rect = Rect(0, 0, 1, 1);\n\n  return rect;\n}\n\n// back like: 0 -1 -2 -3  and forward 0 1 2 3 4 5 6. 
Helps when tracker centered on frame\npair<vector<Mat>, vector<Rect> > DlibTrackerWrapper::Process(vector<Mat> backwardImgs, vector<Mat> forwardImgs)\n{\n  vector<Rect> ret;\n\n  DlibTrackerWrapper backTracker(initial_location_);\n\n  for(auto img: backwardImgs)\n    ret.push_back( backTracker.UpdateTracker(img) );\n\n  if(forwardImgs.size() > 0)\n  {\n    std::reverse(ret.begin(), ret.end());\n    std::reverse(backwardImgs.begin(), backwardImgs.end());\n    backwardImgs.pop_back();\n    ret.pop_back(); // remove the middle, it will be added again. This is initial_location_\n  }\n\n  for(auto img: forwardImgs)\n  {\n    ret.push_back( UpdateTracker(img) );\n    backwardImgs.push_back(img);\n  }\n\n  return std::make_pair(backwardImgs, ret);\n}\n\n}\n"
  },
  {
    "path": "eclipse-project/ibrahim16-deep-act-rec-part/src/dlib-tracker-wrapper.h",
    "content": "/*\n * dlib-tracker-wrapper.h\n *\n *  Created on: 2015-06-22\n *      Author: Moustafa S. Ibrahim\n */\n\n#ifndef DLIB_TRACKER_WRAPPER_H_\n#define DLIB_TRACKER_WRAPPER_H_\n\n#include \"opencv2/core/core.hpp\"\n#include \"opencv2/highgui/highgui.hpp\"\n#include \"opencv2/imgproc/imgproc.hpp\"\nusing cv::Mat;\nusing cv::Ptr;\nusing cv::Scalar;\nusing cv::Rect;\nusing cv::Point;\nusing cv::Size;\n\n#include <dlib/image_processing.h>\n#include <dlib/gui_widgets.h>\n#include <dlib/image_io.h>\n#include <dlib/dir_nav.h>\n\n#include <vector>\nusing std::vector;\nusing std::pair;\n\nnamespace MostCV {\n\nclass DlibTrackerWrapper {\npublic:\n  DlibTrackerWrapper(Rect initial_location);\n\n  Rect UpdateTracker(Mat img);\n  pair<vector<Mat>, vector<Rect> > Process(vector<Mat> backwardImgs, vector<Mat> forwardImgs);\n\nprivate:\n  dlib::correlation_tracker tracker_;\n  Rect initial_location_;\n  int step_;\n};\n\n\n}\n\n#endif /* DLIB_TRACKER_WRAPPER_H_ */\n"
  },
  {
    "path": "eclipse-project/ibrahim16-deep-act-rec-part/src/images-utilities.cpp",
    "content": "#include \"images-utilities.h\"\n\n#include <iostream>\nusing std::cout;\n\n#include \"opencv2/core/core.hpp\"\n#include \"opencv2/highgui/highgui.hpp\"\n#include \"opencv2/imgproc/imgproc.hpp\"\n\n#include \"custom-images-macros.h\"\n#include \"custom-macros.h\"\n\nnamespace MostCV {\n\nvoid ShowImage(Mat image, int wait, bool bShow, string stringWindowName) {\n  if (bShow) {\n    cv::namedWindow(stringWindowName.c_str(), 1);\n    cv::imshow(stringWindowName.c_str(), image);\n    cv::waitKey(wait);\n  }\n}\n\nvoid RemoveImagePixels(Mat img, Mat mask, bool is_mask_remove_pixel_black, Point shift) {\n  REPIMG2(y, x, mask)\n    {\n      if (mask.at<uchar> (y, x) == 0 && !is_mask_remove_pixel_black)\n        continue;\n\n      if (mask.at<uchar> (y, x) > 0 && is_mask_remove_pixel_black)\n        continue;\n\n      if (img.channels() == 3) {\n        for (int c = 0; c < 3; ++c)\n          img.at<cv::Vec3b> (y + shift.y, x + shift.x)[c] = 0;\n      } else\n        img.at<uchar> (y + shift.y, x + shift.x) = 0;\n    }\n}\n\nvoid FixMask(Mat mask, int threshold) {\n  int cnt = 0;\n\n  REPIMG2(y, x, mask)\n    {\n      if (mask.at<uchar> (y, x) >= threshold) {\n        if (mask.at<uchar> (y, x) != 255)\n          cnt++;\n        mask.at<uchar> (y, x) = 255;\n      } else {\n        if (mask.at<uchar> (y, x) != 0)\n          cnt++;\n        mask.at<uchar> (y, x) = 0;\n      }\n    }\n  //if(cnt)    cout<<\"FixMask: \"<<cnt<<\" pixels\\n\";\n}\n\nvoid Morphology(Mat mask, bool do_open, bool do_close, int open_kernel_sz, int close_kernel_sz) {\n\n  Mat open_element = cv::getStructuringElement(0, Size(open_kernel_sz, open_kernel_sz));\n  Mat close_element = cv::getStructuringElement(0, Size(close_kernel_sz, close_kernel_sz));\n\n  if (do_open)\n    cv::morphologyEx(mask, mask, cv::MORPH_OPEN, open_element);\n\n  if (do_close)\n    cv::morphologyEx(mask, mask, cv::MORPH_CLOSE, close_element);\n}\n\nbool AddButton(Mat controlsMat, string buttonName, vector<Rect> 
&rectsSoFar, Scalar color) {\n  int lastY = 0;\n  int lastX = 0;\n  Rect imgRect = Rect(0, 0, controlsMat.cols - 1, controlsMat.rows - 1);\n\n  if (rectsSoFar.size()) {\n    Rect r = rectsSoFar.back();\n    lastY = r.y + r.height + 5;\n    lastX = r.x;\n  }\n  Rect r(lastX, lastY, 100, 30);\n\n  if ((r & imgRect) != r) {\n    lastY = 0;\n    lastX = r.x + r.width + 5;\n    r = Rect(lastX, lastY, 100, 30);\n\n    if ((r & imgRect) != r)\n      return false;\n  }\n\n  cv::rectangle(controlsMat, r, Scalar(255, 255, 255), 2);\n  cv::putText(controlsMat, buttonName, Point(r.x + 2, r.y + r.height / 2), cv::FONT_HERSHEY_SIMPLEX, 0.5, color);\n\n  rectsSoFar.push_back(r);\n\n  return true;\n}\n\nvector<Ptr<CComponenets> > GetConnectedComponenets(Mat img, int area_threshold, int pixels_threshold, Scalar lo_diff, Scalar up_diff, int flags) {\n\n  assert(area_threshold > 0 && pixels_threshold > 0);\n\n  Mat uchar_img;\n  Rect img_rect(0, 0, img.cols - 1, img.rows - 1);\n  vector<Ptr<CComponenets> > componenets;\n\n  if (img.channels() > 1)\n    cvtColor(img, uchar_img, CV_BGR2GRAY);\n  else\n    img.copyTo(uchar_img);\n\n  REPIMG2(y, x, uchar_img)\n    {\n      int pixel_value = (int) uchar_img.at<uchar> (y, x);\n\n      if (pixel_value < 1)\n        continue;\n\n      Rect rect;\n      Mat mask = Mat::zeros(uchar_img.rows + 2, uchar_img.cols + 2, CV_8UC1);\n\n      int mask_pixels_cnt = floodFill(uchar_img, mask, Point(x, y), Scalar(0), &rect, lo_diff, up_diff, flags);\n\n      rect &= img_rect;\n\n      if (rect.area() >= area_threshold && mask_pixels_cnt >= pixels_threshold) {\n        Ptr<CComponenets> component = new CComponenets();\n\n        MostCV::FixMask(mask);\n\n        componenets.push_back(component);\n        component->mask = mask(Rect(1, 1, uchar_img.cols, uchar_img.rows));\n        component->mask_pixels_cnt = mask_pixels_cnt;\n        component->rect = rect;\n        component->flood_starting_point = Point(x, y);\n        
component->parent_mask_topleft_point = Point(0, 0);\n      }\n    }\n  return componenets;\n}\n\nRect GetInternalBlobRect(Mat mask)\n{\n  assert(mask.type() == CV_8UC1);\n\n  vector<Ptr<MostCV::CComponenets> > comps = MostCV::GetConnectedComponenets(mask);\n\n  if(comps.size() == 0)\n    return Rect(0, 0, 1, 1);\n\n  Rect union_rect = comps[0]->rect;\n\n  REP(i, comps)\n    union_rect |= comps[i]->rect;\n\n  return union_rect;\n}\n\nvector<Point> GetCombinedContour(Mat mask) {\n  vector<vector<Point> > contours;\n  vector<cv::Vec4i> hierarchy;\n  Mat componentCpy;\n\n  mask.copyTo(componentCpy);\n  cv::findContours(componentCpy, contours, hierarchy, CV_RETR_EXTERNAL, CV_CHAIN_APPROX_SIMPLE);\n\n  vector<Point> contoursInOne;\n\n  REP(j, contours)\n    contoursInOne.insert(contoursInOne.end(), contours[j].begin(), contours[j].end());\n\n  return contoursInOne;\n}\n\nRect GetRect(Mat img)\n{\n  return Rect(0, 0, img.cols-1, img.rows-1);\n}\n\nvoid CenterRect(Rect &target_rect, int width, int height)\n{\n  if(width > target_rect.width)\n  {\n    target_rect.x -= (width - target_rect.width)/2;\n    target_rect.width = width;\n  }\n\n  if(height > target_rect.height)\n  {\n    target_rect.y -= (height - target_rect.height)/2;\n    target_rect.height = height;\n  }\n}\n\nbool CmpRectTopLeft(const Rect &a, const Rect &b)\n{\n  if(a.y != b.y)\n    return a.y < b.y;\n  return a.x < b.x;\n}\n\nvoid SaveVideo(vector<Mat> images, string path, int fps)\n{\n  if(images.empty())\n  {\n    std::cerr<<\"ERROR: Empty video\\n\";\n    return;\n  }\n\n  cv::VideoWriter videoObject;\n\n  videoObject.open(path, CV_FOURCC('X','V','I','D'), fps, Size(images[0].cols, images[0].rows), true);\n\n  if(!videoObject.isOpened())\n  {\n    std::cerr<<\"ERROR: Problem in out video path: \"<<path<<\"\\n\";\n    assert(false);\n  }\n\n  for(auto img : images)\n    videoObject<<img;\n}\n\n\n\n\n\n\n\n\n}\n"
  },
  {
    "path": "eclipse-project/ibrahim16-deep-act-rec-part/src/images-utilities.h",
    "content": "/*\n * ImagesHelper.h\n *\n *  Created on: 2015-03-01\n *      Author: mostafa\n */\n\n#ifndef IMAGESHELPER_H_\n#define IMAGESHELPER_H_\n\n#include<string>\n#include<vector>\nusing std::vector;\nusing std::string;\n\n#include \"opencv2/core/core.hpp\"\nusing cv::Mat;\nusing cv::Ptr;\nusing cv::Point;\nusing cv::Rect;\nusing cv::Scalar;\nusing cv::Size;\n\n#include \"custom-images-macros.h\"\n\nnamespace MostCV {\n\nstruct CComponenets {\n  Mat mask;\n  int mask_pixels_cnt;\n  Rect rect;\n  Point flood_starting_point;\n  Point parent_mask_topleft_point;\n};\n\nvoid ShowImage(Mat image, int wait = 0, bool bShow = true, string stringWindowName = \"Image\");\n\nvoid RemoveImagePixels(Mat img, Mat mask, bool is_mask_remove_pixel_black = false, Point shift = Point(0, 0));\n\nvoid FixMask(Mat mask, int threshold = 10);\n\nvoid Morphology(Mat mask, bool do_open = true, bool do_close = true, int open_kernel_sz = 3, int close_kernel_sz = 15);\n\nvector<Ptr<CComponenets> > GetConnectedComponenets(Mat img, int area_threshold = 1, int pixels_threshold = 1, Scalar lo_diff = Scalar(1), Scalar up_diff =\n    Scalar(1), int flags = 4 + (255 << 8));\n\nRect GetRect(Mat img);\n\nRect GetInternalBlobRect(Mat mask);\n\nvoid CenterRect(Rect &target_rect, int width, int height);\n\nvector<Point> GetCombinedContour(Mat mask);\n\nbool AddButton(Mat controlsMat, string buttonName, vector<Rect> &rectsSoFar, Scalar color = Scalar(255, 0, 0));\n\nbool CmpRectTopLeft(const Rect &a, const Rect &b);\n\nvoid SaveVideo(vector<Mat> images, string path, int fps = 25);\n\n////////////////////////////\n\ntemplate<class Type>  Mat ToRowMat(const vector<Type> &row)\n{\n  if(row.size() == 0)\n    return Mat(0, 0, cv::DataType<Type>::type);\n\n  const Type *ptr = &row[0];\n  Mat mat = Mat(1, row.size(), cv::DataType<Type>::type);\n\n  memcpy(mat.data, ptr, row.size()*sizeof(Type));\n\n  //Mat tempMat = Mat(featureVec).t();\n\n  return mat;\n}\n\ntemplate<class Type>  Mat ToColMat(const 
vector<Type> &col)\n{\n  if(col.size() == 0)\n    return Mat(0, 0, cv::DataType<Type>::type);\n\n  const Type *ptr = &col[0];\n  Mat mat = Mat(col.size(), 1, cv::DataType<Type>::type);\n\n  memcpy(mat.data, ptr, col.size()*sizeof(Type));\n\n  return mat;\n}\n\ntemplate<class Type>  Mat To2DMat(const vector<vector<Type>> & vectors)\n{\n  Mat mat;\n\n  for(auto row : vectors)\n    mat.push_back(ToRowMat(row));\n\n  return mat;\n}\n\n/*\ntemplate<typename function> void perform(function operation, Mat mat) {\n  if(mat.channels() == 2)\n  {\n    REPIMG2(y, x, mat)\n        mat.at<>\n  }\n  else {\n\n  }\n}\n*/\n\n}\n\n#endif /* IMAGESHELPER_H_ */\n"
  },
  {
    "path": "eclipse-project/ibrahim16-deep-act-rec-part/src/leveldb-reader.cpp",
    "content": "/*\n * leveldb-reader.cpp\n *\n *  Created on: 2015-05-21\n *      Author: Moustafa S. Ibrahim\n */\n\n#include<algorithm>\n#include<iostream>\n#include<fstream>\n\n#include \"leveldb-reader.h\"\nusing std::ifstream;\nusing std::ofstream;\nusing std::endl;\nusing std::cout;\n\n#include \"utilities.h\"\n\nMostCV::LevelDBReader::LevelDBReader(const string & database_path, const string & sorted_list_file) {\n  record_idx_ = 0;\n  cache_limit_ = 1000;\n  database_path_ = database_path;\n\n  is_caching = true;\n\n  if (sorted_list_file == \"\")\n    is_caching = false;\n\n  if (is_caching) {\n    ifstream ifs(sorted_list_file.c_str());\n    string line;\n\n    assert(ifs.is_open());\n\n    while (getline(ifs, line)) {\n      int pos = line.find(' ');\n\n      if (pos != -1)\n        line = line.substr(0, pos);\n\n      pos = line.find_last_of('/');\n\n      if (pos != -1)\n        line = line.substr(pos + 1);\n\n      if (line != \"\")\n        vectors_names_.push_back(line);\n    }\n    vector<string> images_names_temp = vectors_names_;\n    std::sort(images_names_temp.begin(), images_names_temp.end());\n\n    assert(images_names_temp == vectors_names_);\n  }\n\n  leveldb::Options options;\n  options.create_if_missing = true;\n  leveldb::Status status = leveldb::DB::Open(options, database_path_, &database_);\n  assert(status.ok());\n\n  database_iter_ = database_->NewIterator(leveldb::ReadOptions());\n  assert(database_iter_ != NULL);\n\n  database_iter_->SeekToFirst();\n}\n\nMostCV::LevelDBReader::~LevelDBReader() {\n  if (database_iter_ != NULL)\n    delete database_iter_;\n\n  if (database_ != NULL)\n    delete database_;\n}\n\nbool MostCV::LevelDBReader::GetNextEntry(string &key, vector<double> &retVec, int &label) {\n  if (!database_iter_->Valid())\n    return false;\n\n  Datum datum;\n  datum.clear_float_data();\n  datum.clear_data();\n  datum.ParseFromString(database_iter_->value().ToString());\n\n  key = database_iter_->key().ToString();\n  
label = datum.label();\n\n  int expected_data_size = std::max<int>(datum.data().size(), datum.float_data_size());\n  const int datum_volume_size = datum.channels() * datum.height() * datum.width();\n  if (expected_data_size != datum_volume_size) {\n    cout << \"Something wrong in saved data.\";\n    assert(false);\n  }\n\n  retVec.resize(datum_volume_size);\n\n  const string& data = datum.data();\n  if (data.size() != 0) {\n    // Data stored in string, e.g. just pixel values of 196608 = 256 * 256 * 3\n    for (int i = 0; i < datum_volume_size; ++i)\n      retVec[i] = data[i];\n  } else {\n    // Data stored in real feature vector such as 4096 from feature extraction\n    for (int i = 0; i < datum_volume_size; ++i)\n      retVec[i] = datum.float_data(i);\n  }\n\n  database_iter_->Next();\n  ++record_idx_;\n\n  return true;\n}\n\nbool MostCV::LevelDBReader::GetNextEntryByKey(const string & name, vector<double> &retVec, int &label) {\n\n  if (!is_caching) {\n    cout << \"A sorted file MUST be given. 
What are you trying to retrive!\\n\";\n    assert(false);\n  }\n\n  if (cache_.count(name)) {\n    retVec = cache_[name];\n    return true;\n  }\n\n  string key;\n  while (GetNextEntry(key, retVec, label)) {\n    if ((int) cache_items_.size() == cache_limit_) {\n      map<string, vector<double> >::iterator it = cache_.find(cache_items_.front());\n\n      assert(it != cache_.end());\n      cache_.erase(it);\n      cache_items_.pop_front();\n    }\n    cache_[vectors_names_[record_idx_ - 1]] = retVec;\n    cache_items_.push_back(vectors_names_[record_idx_ - 1]);\n\n    if (vectors_names_[record_idx_ - 1] == name)\n      return true;\n  }\n\n  cout << \"Reached end of data: Total Records: \" << record_idx_ << \"\\n\";\n  cout << \"Failed to find data for: \" << name << \" in database path: \" << database_path_ << \"\\n\";\n\n  assert(false);  // We failed to retrieve!\n\n  return false;\n}\n\nvoid MostCV::LevelDBReader::Dump(const string & file_path, int featureVectorLimit) {\n  record_idx_ = 0;\n  database_iter_->SeekToFirst();\n\n  ofstream ofs(file_path.c_str());\n\n  vector<double> retVec;\n  string key;\n  int label;\n\n  while (GetNextEntry(key, retVec, label)) {\n    ofs << \"key=\" << key << \", label=\" << label << \", features length=\" << retVec.size();\n\n    if (featureVectorLimit > 0) {\n      ofs << \", truncated\";\n      retVec.resize(featureVectorLimit);  // To avoid writing much\n    }\n\n    ofs << \", feature vec= \";\n    for (size_t i = 0; i < retVec.size(); ++i)\n      ofs << retVec[i] << \" \";\n    ofs << \"\\n\";\n  }\n  ofs.close();\n\n  cout << \"\\nDump done: Total Records: \" << record_idx_ << \"\\n\";\n}\n\nvoid MostCV::LevelDBReader::DumpSmall(const string &file_path, int featureVectorLimit, bool make_random) {\n  record_idx_ = 0;\n  database_iter_->SeekToFirst();\n\n  ofstream ofs(file_path.c_str());\n\n  vector<double> retVec;\n\n  string key;\n  int label;\n\n  for (int cnt = 0; cnt < 500 && GetNextEntry(key, retVec, label); ++cnt) 
{\n    ofs << \"key=\" << key << \", label=\" << label << \", features length=\" << retVec.size();\n\n    if (make_random)\n      std::random_shuffle(retVec.begin(), retVec.end());\n\n    if (featureVectorLimit > 0) {\n      ofs << \", truncated\";\n      retVec.resize(featureVectorLimit);  // To avoid writing much\n    }\n\n    ofs << \", feature vec= \";\n    for (size_t i = 0; i < retVec.size(); ++i)\n      ofs << retVec[i] << \" \";\n    ofs << \"\\n\";\n  }\n  ofs.close();\n\n  cout << \"\\nDump done: Total Records: \" << record_idx_ << \"\\n\";\n}\n\nvoid MostCV::LevelDBReader::ReadLabels(vector<int> &labels, int max_rows) {\n  record_idx_ = 0;\n  database_iter_->SeekToFirst();\n\n  labels.clear();\n\n  string key;\n  int label;\n  vector<double> retVec;\n\n  for (int row = 0; GetNextEntry(key, retVec, label); ++row) {\n    if(max_rows != -1 && max_rows == row)\n      break;\n    labels.push_back(label);\n  }\n}\n\nint MostCV::LevelDBReader::GetRecordsCount() {\n  record_idx_ = 0;\n  database_iter_->SeekToFirst();\n\n  string key;\n  int label;\n  vector<double> retVec;\n\n  while (GetNextEntry(key, retVec, label))\n    ;\n\n  return record_idx_;\n}\n\nvoid MostCV::LevelDBReader::SeekToHead() {\n  record_idx_ = 0;\n  database_iter_->SeekToFirst();\n}\n\n\n\n"
  },
  {
    "path": "eclipse-project/ibrahim16-deep-act-rec-part/src/leveldb-reader.h",
    "content": "/*\n * leveldb-reader.h\n *\n *  Created on: 2015-05-21\n *      Author: Moustafa S. Ibrahim\n */\n\n/*\n * This file handles the reading of leveldb files. The database holds a set of feature vectors, all of the same length.\n */\n\n#ifndef LEVELDB_READER_H_\n#define LEVELDB_READER_H_\n\n#include <stdio.h>\n\n#include <string>\n#include <vector>\n#include <deque>\n#include <cassert>\n#include <iostream>\n#include <fstream>\n#include <map>\nusing std::map;\nusing std::deque;\nusing std::vector;\nusing std::string;\nusing std::endl;\nusing std::cout;\n\n#include <google/protobuf/text_format.h>\n#include <glog/logging.h>\n\n#include <leveldb/db.h>\n#include <leveldb/write_batch.h>\n\n#include \"caffe/proto/caffe.pb.h\"\n#include \"caffe/blob.hpp\"\n#include \"caffe/common.hpp\"\n#include \"caffe/net.hpp\"\n#include \"caffe/util/db.hpp\"\n#include \"caffe/util/io.hpp\"\n#include \"caffe/vision_layers.hpp\"\n\nusing caffe::Blob;\nusing caffe::Caffe;\nusing caffe::Datum;\nusing caffe::Net;\n\nnamespace MostCV {\n\n/*\n * The class opens a leveldb directory that holds a set of feature vectors (e.g. extracted by the feature_extract tool from caffe).\n * Each feature vector has a name, listed in order in a \"sorted\" file.\n * The user can either retrieve all feature vectors in order or filter by name.\n *\n * The user is expected to use only one of the GetNextEntry methods. Similarly, after calling a Dump method, the other methods should not be used.\n * Reason for this limitation: all the methods advance the database iterator; e.g., after dumping, there are no more rows to read.\n *\n * Usage Example:\n *\n * LevelDBReader reader(database_path, sorted_images_list_file);\n * string key;\n * int label;\n * vector<double> feature_vector;\n *\n * while (reader.GetNextEntry(key, feature_vector, label))\n *   doSomething(feature_vector);\n *\n */\nclass LevelDBReader {\npublic:\n  /*\n   * Open and prepare the database for reading. 
The database is allowed to have more rows than the file; in that case the extra rows have no corresponding names.\n   *\n   * The file names should be sorted; this allows efficient retrieval (e.g. caching the last 200 rows). Consequently, the leveldb should also be sorted on this key.\n   *\n   * If no file is given, entries are just retrieved sequentially from the DB. This is more suitable for dumping purposes.\n   */\n  LevelDBReader(const string & database_path, const string & sorted_list_file = \"\");\n  ~LevelDBReader();\n\n  // Read the next entry from the database. Returns false if there are no more rows.\n  bool GetNextEntry(string &key, vector<double> &retVec, int &label);\n\n  // Given an entry name from the sorted_images_list_file, return the corresponding vector. Consecutive calls should be ordered by name.\n  //    If not, the requested name should be close to the last retrieved element so it can still be served from the cache. We cache the last cache_limit_ elements.\n  bool GetNextEntryByKey(const string & name, vector<double> &retVec, int &label);\n\n  // For debugging purposes, dump the database to a file. Truncates each vector after the first featureVectorLimit elements.\n  void Dump(const string &file_path, int featureVectorLimit = -1);\n  void DumpSmall(const string &file_path, int featureVectorLimit = -1, bool make_random = true);\n  void ReadLabels(vector<int> &labels, int max_rows = -1);\n  int GetRecordsCount();\n  void SeekToHead();\n\nprivate:\n  bool is_caching;\n  vector<string> vectors_names_;\n  string database_path_;\n\n  leveldb::DB* database_;\n  leveldb::Iterator* database_iter_;\n\n  // Caching Variables\n  map<string, vector<double> > cache_;\n  deque<string> cache_items_;\n  int cache_limit_;\n\n  // Current row index in retrieval\n  int record_idx_;\n};\n\n}\n\n#endif /* LEVELDB_READER_H_ */\n"
  },
  {
    "path": "eclipse-project/ibrahim16-deep-act-rec-part/src/leveldb-writer.cpp",
    "content": "/*\n * LeveldbWriter.cpp\n *\n *  Created on: 2015-04-02\n *      Author: Moustafa S. Ibrahim\n */\n\n#include <iostream>\n\n#include \"leveldb-writer.h\"\nusing std::cerr;\nusing std::cout;\n\n#include \"utilities.h\"\n\nconst int WRITING_LIMIT = 1000;\n\nnamespace MostCV {\n\nLeveldbWriter::LeveldbWriter(string db_path_, int resize_height_, int resize_width_, int volumeSize, bool is_virtual_) {\n  max_label_cnt = -1;\n  db_path = db_path_;\n  resize_height = resize_height_;\n  resize_width = resize_width_;\n  volume_size = volumeSize;\n  is_virtual = is_virtual_;\n\n  cerr<<\"\\n\\nCreates a database at: \"<<db_path_<<\"\\n\";\n\n  if(is_virtual_)\n    cerr<<\"\\tUsing VIRTUAL MODE dataset\\n\\n\";\n\n  countId = 0;\n  lastCountId = 0;\n  internal_idx = 1;\n\n  if (resize_height > 0) {  // then something already defined for the shape\n    datum.set_channels(volume_size);\n    datum.set_height(resize_height);\n    datum.set_width(resize_width);\n\n    cerr<<\"\\t(H, W, C) = \"<<resize_height<<\" \"<<resize_width<<\" \"<<volume_size<<\"\\n\";\n  }\n  if(!is_virtual) {\n    // leveldb\n    leveldb::Options options;\n    options.error_if_exists = true;\n    options.create_if_missing = true;\n    options.write_buffer_size = 268435456;\t// 8 * 32 * 1024 * 1024\n\n    // Open db\n    LOG(INFO)<< \"Opening leveldb \" << db_path;\n    leveldb::Status status = leveldb::DB::Open(options, db_path, &db);\n    CHECK(status.ok()) << \"Failed to open leveldb \" << db_path << \". 
Does it already exist?\";\n    batch = new leveldb::WriteBatch();\n  }\n\n  is_closed = false;\n}\n\nLeveldbWriter::~LeveldbWriter() {\n  forceFinalize();\n}\n\nvoid LeveldbWriter::clearDatum() {\n  assert(!is_closed);\n\n  datum.clear_data();\n  datum.clear_float_data();\n}\n\nvoid LeveldbWriter::setLabelsRange(int max_label_cnt) {\n  assert(!is_closed);\n\n  this->max_label_cnt = max_label_cnt;\n}\n\nvoid LeveldbWriter::setDatumLabel(int id) {\n  assert(!is_closed);\n\n  assert(id >= 0);\n\n  if (max_label_cnt != -1 && id >= max_label_cnt) {\n    cerr << \"Wrong label! (Received, expected) = \" << id << \" - \" << max_label_cnt << \"\\n\";\n    assert(false);\n  }\n\n  datum.set_label(id);\n  labels.insert(id);\n  labelsVec.push_back(id);\n}\n\nvoid LeveldbWriter::addDatumToBatch(string key) {\n  assert(!is_closed);\n\n  if (key != \"\" && keys.insert(key).second == false)\n    cerr << \"Warning: key duplication: \" << key << \"\\n\";\n\n  if(is_virtual)  return;\n\n  string value;\n  CHECK(datum.SerializeToString(&value));  // CHECK, not assert: serialization must still run when NDEBUG is defined\n\n  string prefix = MostCV::toIntStr(\"0000000\", internal_idx++) + \"@\";\n  batch->Put(prefix + key, value);\n\n  if (++countId % WRITING_LIMIT == 0)\n    writeBatch();\n\n  clearDatum();\n}\n\nvoid LeveldbWriter::addDatumToBatch(caffe::Datum &datum, string key, int label) {\n  assert(!is_closed);\n\n  if (keys.insert(key).second == false)\n    cerr << \"Warning: Key duplication: \" << key << \"\\n\";\n\n  assert(label >= 0);\n\n  string value;\n  datum.set_label(label);\n  labels.insert(label);\n  labelsVec.push_back(label);\n\n  if(is_virtual)  return;\n\n  CHECK(datum.SerializeToString(&value));\n\n  string prefix = MostCV::toIntStr(\"0000000\", internal_idx++) + \"@\";\n  batch->Put(prefix + key, value);\n\n  if (++countId % WRITING_LIMIT == 0)\n    writeBatch();\n\n  clearDatum();\n}\n\nbool LeveldbWriter::addVectorDatum(const vector<double> &feature_vec) {\n  assert(!is_closed);\n\n  if(is_virtual)  return true;\n\n  
clearDatum();\n\n  if (resize_height <= 0) {  // use first vector to define the outline\n    datum.set_height(resize_height = feature_vec.size());\n    datum.set_channels(1);\n    datum.set_width(1);\n  } else\n    assert((int )feature_vec.size() == resize_height * resize_width * volume_size);\n\n  for (int p = 0; p < (int) feature_vec.size(); ++p)\n    datum.add_float_data(feature_vec[p]);\n\n  return true;\n}\n\nbool LeveldbWriter::addImageToDatum(Mat imgMat_origin, int num_channels) {\n  assert(!is_closed);\n\n  if(is_virtual)  return true;\n\n  assert(resize_width > 0 && resize_height > 0);\n  assert(imgMat_origin.channels() == num_channels);  // Weird to send it :D\n\n  Mat imgMat;\n  cv::resize(imgMat_origin, imgMat, Size(resize_width, resize_height));\n  // add to db: 256 * 256 * 3 = 196608\n\n  string* datum_string = datum.mutable_data();\n\n  if (num_channels == 3) {\n    for (int c = 0; c < num_channels; ++c) {\n      for (int h = 0; h < imgMat.rows; ++h) {\n        for (int w = 0; w < imgMat.cols; ++w) {\n          datum_string->push_back(static_cast<uint8_t>(imgMat.at<cv::Vec3b>(h, w)[c]));\n        }\n      }\n    }\n  } else {\n    for (int h = 0; h < imgMat.rows; ++h) {\n      for (int w = 0; w < imgMat.cols; ++w) {\n        datum_string->push_back(static_cast<uint8_t>(imgMat.at<uchar>(h, w)));\n      }\n    }\n  }\n\n  return true;\n}\n\nbool LeveldbWriter::addImageToDatum(const string& filename, int num_channels) {\n  assert(!is_closed);\n\n  if(is_virtual)  return true;\n\n  int cv_read_flag = (num_channels == 3 ? 
CV_LOAD_IMAGE_COLOR : CV_LOAD_IMAGE_GRAYSCALE);\n\n  Mat imgMat_origin = cv::imread(filename, cv_read_flag);\n\n  if (!imgMat_origin.data) {\n    LOG(ERROR)<< \"Could not open or find file \" << filename;\n    return false;\n  }\n  return addImageToDatum(imgMat_origin, num_channels);\n}\n\nvoid LeveldbWriter::writeBatch() {\n  if (is_closed)\n    return;\n\n  if(is_virtual)  return;\n\n  if (countId == lastCountId)  // nothing changed\n    return;\n\n  leveldb::Status status = db->Write(leveldb::WriteOptions(), batch);\n  CHECK(status.ok()) << \"Failed to write the batch. Count id:  \" << countId << \"\\n\";\n\n  delete batch;\n  batch = new leveldb::WriteBatch();\n\n  LOG(ERROR)<<db_path<<\": Processed \" << countId << \" files.\";\n  lastCountId = countId;\n}\n\nvoid LeveldbWriter::forceFinalize() {\n  if (is_closed)\n    return;\n\n  if(!is_virtual) {\n\n    // write the last batch\n    if (countId % WRITING_LIMIT != 0)\n      writeBatch();\n\n    if (batch != NULL)\n      delete batch;\n\n    if (db != NULL)\n      delete db;\n  }\n\n  if (labels.size() == 1)  // Size zero means the caller is not interested in setting labels (just dummy labels); a single label is suspicious.\n    cerr << \"\\n\\n\\nThere is only ONE label in the database. There is probably a bug\\n\";\n\n  cerr<<\"\\nLabels Statistics for db \"<<db_path<<\"\\n\";\n\n  cerr<<\"Total Records \"<<labelsVec.size()<<\"\\n\";\n\n  cerr<<\"*********************************************************\\n\";\n\n  MostCV::getFrequencyMap(labelsVec, true);\n\n  cerr<<\"*********************************************************\\n\";\n\n  MostCV::getFrequencyMapPercent(labelsVec, true);\n\n  is_closed = true;\n}\n\n}\n"
  },
  {
    "path": "eclipse-project/ibrahim16-deep-act-rec-part/src/leveldb-writer.h",
    "content": "/*\n * LeveldbWriter.h\n *\n *  Created on: 2015-04-02\n *      Author: Moustafa S. Ibrahim\n */\n\n#ifndef LeveldbWriter_H_\n#define LeveldbWriter_H_\n\n#include <string>\n#include <set>\n#include <vector>\nusing std::vector;\nusing std::set;\nusing std::string;\n\n#include <glog/logging.h>\n#include <leveldb/db.h>\n#include <leveldb/write_batch.h>\n#include \"caffe/proto/caffe.pb.h\"\n\n#include \"opencv2/core/core.hpp\"\n#include \"opencv2/highgui/highgui.hpp\"\n#include \"opencv2/imgproc/imgproc.hpp\"\nusing cv::Mat;\nusing cv::Size;\n\n\nnamespace MostCV {\n\nclass LeveldbWriter {\npublic:\n  // Using zero parameters would mean not interested to add addImageToDatum functionality.\n  LeveldbWriter(string db_path, int resize_height = -1, int resize_width = 1, int volumeSize = 1, bool is_virtual = false);\n  ~LeveldbWriter();\n\n  void clearDatum();\n  void setDatumLabel(int id);\n  bool addImageToDatum(const string& filename, int num_channels);\n  bool addImageToDatum(Mat img, int num_channels);\n\n  bool addVectorDatum(const vector<double> &feature_vec);\n  void addDatumToBatch(string key = \"\");\n  void addDatumToBatch(caffe::Datum &datum, string key, int label);\n\n  void setLabelsRange(int max_label_cnt);\n  void forceFinalize();\n\nprivate:\n  void writeBatch();\n\n\n  leveldb::DB* db;\n  leveldb::WriteBatch* batch;\n  caffe::Datum datum;\n  int countId;\n  int lastCountId;\n\n  string db_path;\n  int resize_height;\n  int resize_width;\n  int volume_size;\n  int internal_idx;\n\n  set<int> labels;          //helps in verification.\n  vector<int> labelsVec;    // print purposes\n  int max_label_cnt;\n  set<string> keys;\n  bool is_closed;\n  bool is_virtual;\n};\n\n}\n\n#endif /* LeveldbWriter_H_ */\n"
  },
  {
    "path": "eclipse-project/ibrahim16-deep-act-rec-part/src/rect-helper.cpp",
    "content": "/*\n * RectHelper.cpp\n *\n *  Created on: 2015-07-06\n *      Author: Moustafa S. Ibrahim\n */\n\n#include \"rect-helper.h\"\n\n#include \"images-utilities.h\"\n#include \"utilities.h\"\n#include \"custom-macros.h\"\n\nnamespace MostCV {\n\nRectHelper::RectHelper(Rect rect, double score) {\n  r = rect;\n  conf_score = score;\n  color = Scalar(rand() % 256, rand() % 256, rand() % 256); // random color\n}\n\nvector<RectHelper> RectHelper::ToRectHelpers(const vector<Rect> &rectangles_vec) {\n  vector<RectHelper> ret;\n\n  for(auto rect : rectangles_vec)\n    ret.push_back(RectHelper(rect));\n\n  return ret;\n}\n\nvector<Rect> RectHelper::ToRects(const vector<RectHelper> &rectangles_vec)\n{\n  vector<Rect> ret;\n\n  for(auto rect : rectangles_vec)\n    ret.push_back(rect.r);\n\n  return ret;\n}\n\n//////////////////////////// Static Methods /////////////////////////////\n\nvoid RectHelper::DrawRects(Mat img, const vector<RectHelper> &rectangles_vec, bool is_make_copy, bool is_show, Scalar color) {\n  Mat imgTemp;\n\n  if (is_make_copy) {\n    img.copyTo(imgTemp);\n    img = imgTemp;\n  }\n\n  for (auto rect_helper : rectangles_vec)\n    cv::rectangle(img, rect_helper.r, (color[0] == -1) ? 
rect_helper.color : color, 2);\n\n  int maxArea = 600 * 800;\n  int dif = sqrt(img.rows * img.cols / maxArea);\n\n  if(dif > 1)\n  {\n    Size size(img.cols / dif, img.rows / dif);\n    Mat toImg;\n    cv::resize(img, toImg, size);\n    img = toImg;\n  }\n\n  MostCV::ShowImage(img, 0, is_show);\n}\n\nmap<string, vector<RectHelper> > RectHelper::LoadImagesRectangles(string path_x1_y1_w_h){\n  map<string, vector<RectHelper> > retMap;\n\n  ifstream ifs(path_x1_y1_w_h);\n\n  int cnt;\n  string image_name;\n\n  while(ifs>>image_name>>cnt)\n  {\n    vector<RectHelper> rectHelpers;\n\n    while(cnt--)\n    {\n      double x, y, w, h;\n      double score;\n      ifs>>x>>y>>w>>h>>score;\n\n      rectHelpers.push_back(RectHelper(Rect(x, y, w, h), score));\n    }\n    retMap[image_name] = rectHelpers;\n  }\n  ifs.close();\n\n  return retMap;\n}\n\nvoid RectHelper::WriteImagesRectangles(const map<string, vector<RectHelper> > &image_rect_helpers_Map, string path_x1_y1_w_h)\n{\n  ofstream ofs(path_x1_y1_w_h);\n\n  for (auto img_rects_pair : image_rect_helpers_Map)\n  {\n    ofs<<img_rects_pair.first<<\" \"<<img_rects_pair.second.size();\n\n    for (auto rectHelper: img_rects_pair.second)\n      ofs<<\" \"<<rectHelper.r.x<<\" \"<<rectHelper.r.y<<\" \"<<rectHelper.r.width<<\" \"<<rectHelper.r.height<<\" \"<<rectHelper.conf_score;\n    ofs<<\"\\n\";\n  }\n  ofs.close();\n}\n\nvoid RectHelper::FilterBelowConfidenceThreshold(vector<RectHelper> &rects, double conf_score_threshold)\n{\n  for (size_t i = 0; i < rects.size(); ++i) {\n    if(MostCV::dcmp(rects[i].conf_score, conf_score_threshold) < 0)\n    {\n      rects.erase(rects.begin() + i);\n      --i;\n    }\n  }\n}\n\n\nbool __CmpSortByConfidence(const RectHelper &a, const RectHelper& b)\n{\n  return MostCV::dcmp(a.conf_score, b.conf_score) < 0;\n}\n\nvoid RectHelper::SortByConfidence(vector<RectHelper> &rects)\n{\n  sort(RALL(rects), __CmpSortByConfidence);\n}\n\nbool __CmpSortByTopLeftPoint(const RectHelper &a, const RectHelper& 
b)\n{\n  int d = MostCV::dcmp(a.r.x, b.r.x);\n\n  if(d != 0)\n    return d < 0;\n  return MostCV::dcmp(a.r.y, b.r.y) < 0;\n}\n\nvoid RectHelper::SortByTopLeftPoint(vector<RectHelper> &rects)\n{\n  sort(RALL(rects), __CmpSortByTopLeftPoint);\n}\n\n}\n"
  },
  {
    "path": "eclipse-project/ibrahim16-deep-act-rec-part/src/rect-helper.h",
    "content": "/*\n * RectHelper.h\n *\n *  Created on: 2015-07-06\n *      Author: Moustafa S. Ibrahim\n */\n\n#ifndef RECTHELPER_H_\n#define RECTHELPER_H_\n\n#include <iostream>\n#include <fstream>\n#include <vector>\n#include <string>\n#include <map>\nusing std::vector;\nusing std::map;\nusing std::string;\nusing std::endl;\nusing std::cout;\nusing std::ifstream;\nusing std::ofstream;\n\n#include \"opencv2/core/core.hpp\"\n#include \"opencv2/highgui/highgui.hpp\"\n#include \"opencv2/imgproc/imgproc.hpp\"\nusing cv::Mat;\nusing cv::Scalar;\nusing cv::Rect;\nusing cv::Point;\nusing cv::Size;\n\nnamespace MostCV {\n\nclass RectHelper {\npublic:\n\n  RectHelper(Rect rect = Rect(0, 0, 0, 0), double score = -1);\n\n  static vector<RectHelper> ToRectHelpers(const vector<Rect> &rectangles_vec);\n  static vector<Rect> ToRects(const vector<RectHelper> &rectangles_vec);\n  static void DrawRects(Mat img, const vector<RectHelper> &rectangles_vec, bool is_make_copy = true, bool is_show = true, Scalar color = Scalar(-1, -1, -1));\n  static void SortByConfidence(vector<RectHelper> &rects);\n  static void SortByTopLeftPoint(vector<RectHelper> &rects);\n  static void FilterBelowConfidenceThreshold(vector<RectHelper> &rects, double conf_score_threshold);\n  static map<string, vector<RectHelper> > LoadImagesRectangles(string path_x1_y1_w_h);\n  static void WriteImagesRectangles(const map<string, vector<RectHelper> > &imageRectHelpersMap, string path_x1_y1_w_h);\n\n  Rect r;\n  double conf_score;\n  string category;  // E.g. Car bbox\n  int category_idx;\n  Scalar color;   // For drawing\n\n  Mat image;  // Image the rectangle belong to it\n  string image_name;\n  string image_path;\n  string image_parent_path;\n};\n\n}\n\n#endif /* RECTHELPER_H_ */\n"
  },
  {
    "path": "eclipse-project/ibrahim16-deep-act-rec-part/src/utilities.cpp",
    "content": "/*\n * Utilities.cpp\n *\n *  Created on: 2015-03-13\n *      Author: Moustafa S. Ibrahim\n */\n\n#include \"utilities.h\"\n\n#include <stdio.h>\n#include <stdlib.h>\n\n#include <cstring>\n#include <cmath>\nusing std::memcpy;\nusing std::fabs;\n\n#include <boost/filesystem.hpp>\n#include <boost/algorithm/string/predicate.hpp>\n#include <boost/random/mersenne_twister.hpp>\n#include <boost/random/uniform_int.hpp>\n#include <boost/random/variate_generator.hpp>\nnamespace bst_fs = boost::filesystem;\nusing namespace boost::filesystem;\n\n#include \"custom-abbreviation.h\"\n\nnamespace MostCV {\n\nint dcmp(double x, double y) {\n\treturn fabs(x - y) <= EPS ? 0 : x < y ? -1 : 1;\n}\n\nmap<string, int> BuildStringIdMap(set<string> classes) {\n\tmap<string, int> classId;\n\n\tREPIT(strIt, classes)\n\t{\n\t\tstring str = *strIt;\n\n\t\tif (classId.count(str) == 0) {\n\t\t\tint sz = classId.size();\n\t\t\tclassId[str] = sz;\n\t\t}\n\t}\n\n\treturn classId;\n}\n\nmap<string, int> BuildStringIdMap(vector<string> classesVec) {\n\n\tset<string> classes(classesVec.begin(), classesVec.end());\n\n\treturn BuildStringIdMap(classes);\n}\n\nint UpdateStringIdMap(map<string, int> &classId, string str) {\n\tif (classId.count(str) == 0) {\n\t\tint sz = classId.size();\n\t\tclassId[str] = sz;\n\t\treturn sz;\n\t}\n\treturn classId[str];\n}\n\ndouble round(double d, int precision) {\n\tostringstream oss;\n\toss.setf(std::ios::fixed);\n\toss.precision(precision);\n\toss << d;\n\n\tistringstream iss(oss.str());\n\tiss >> d;\n\treturn d;\n}\n\nvoid fixDir(string &dir) {\n\tif (SZ(dir) == 0)\n\t\treturn;\n\n\tif (dir[SZ(dir) - 1] != PATH_SEP)\n\t\tdir += PATH_SEP;\n}\n\nstring getFileName(string dir) {\n\tint idx = dir.find_last_of(PATH_SEP);\n\n\tif (idx == -1)\n\t\treturn dir;\n\n\treturn dir.substr(idx + 1);\n}\n\nbool fileExist(string szFilePath, bool print) {\n\tifstream fin(szFilePath.c_str());\n\n\tif (!fin) {\n\t\tif (print)\n\t\t\tprintf(\"fileExist: Failed to open 
file [%s]\\n\", szFilePath.c_str());\n\t\treturn false;\n\t}\n\tfin.close();\n\treturn true;\n}\n\nstring trim(string str) {\n\tint s = 0, e = SZ(str) - 1;\n\tREP(i, str)\n\t{\n\t\tif (!isspace(str[i]))\n\t\t\tbreak;\n\t\ts++;\n\t}\n\n\tLPD(i, SZ(str)-1, 0)\n\t{\n\t\tif (!isspace(str[i]))\n\t\t\tbreak;\n\t\te--;\n\t}\n\n\tif (s > e)\n\t\treturn \"\";\n\treturn str.substr(s, e - s + 1);\n}\n\nstring toLower(string str) {\n\tstring ret = \"\";\n\tREP(i, str)\n\t\tret += tolower(str[i]);\n\treturn ret;\n}\n\nstring toUpper(string str) {\n\tstring ret = \"\";\n\tREP(i, str)\n\t\tret += toupper(str[i]);\n\treturn ret;\n}\n\nbool startsWith(string str, string pat) {\n\treturn (int) str.find(pat) == 0;\n}\n\nint random(int range) {\n\treturn rand() % range;\n}\n\nchar* toCharArr(string str) {\n\tchar *s = new char[SZ(str) + 1];\n\ts[SZ(str)] = '\\0';\n\tmemcpy(s, str.c_str(), SZ(str));\n\treturn s;\n}\n\nstring toIntStr(string st, int add, bool append_zeros) {\n\tint val = toType(st, 1);\n\tval += add;\n\tstring ret = toString(val);\n\n\tif (append_zeros && ret.size() < st.size())\n\t\tret = string(st.size() - ret.size(), '0') + ret;  //pad zeros\n\treturn ret;\n}\n\nstring removeExt(string name) {\n\tint pos = name.find_last_of('.');\n\n\tif (pos != -1)\n\t\tname = name.substr(0, pos);\n\treturn name;\n}\n\nbool IsPathExist(string path) {\n\treturn boost::filesystem::exists(path);\n}\n\nint CountFileLines(string path)\n{\n\t std::ifstream inFile(path);\n\n\t if(inFile.fail())\n\t {\n\t\t cerr<<\"Couldn't open path: \"<<path<<\"\\n\";\n\n\t\t assert(false);\n\t }\n\n\t int ans = std::count(std::istreambuf_iterator<char>(inFile), std::istreambuf_iterator<char>(), '\\n');\n\n\t inFile.close();\n\n\t return ans;\n}\n\nvector<int> GetPerm(int length, int seed)\n{\n\tboost::mt19937 randGenerator(seed);\n\tboost::uniform_int<> uniform_int_dist;\n\tboost::variate_generator<boost::mt19937&, boost::uniform_int<> > rand_generator(randGenerator, uniform_int_dist);\n\n\tvector<int> 
perm(length);\n\n\tfor (int i = 0; i < (int) perm.size(); ++i)\n\t\tperm[i] = i;\n\n\t// Fisher-Yates shuffle with the seeded generator; without this the function would always return the identity permutation\n\tfor (int i = (int) perm.size() - 1; i > 0; --i)\n\t\tstd::swap(perm[i], perm[randGenerator() % (i + 1)]);\n\n\treturn perm;\n}\n\nstring consumeStringParam(int &argc, char** &argv, string variable_name) {\n\treturn consumeParam(argc, argv, string(\"\"), variable_name);\n}\n\nint consumeIntParam(int &argc, char** &argv, string variable_name) {\n\treturn consumeParam(argc, argv, 1, variable_name);\n}\n\ndouble consumeDoubleParam(int &argc, char** &argv, string variable_name) {\n\treturn consumeParam(argc, argv, 1.0, variable_name);\n}\n\nvector<string> GetDirs(string szRoot) {\n\tvector<string> ret;\n\n\tfor (bst_fs::directory_iterator itr(szRoot); itr != bst_fs::directory_iterator(); ++itr) {\n\n\t\tstring path_str = itr->path().c_str();\n\n\t\tif (bst_fs::is_directory((itr->status())))\n\t\t\tret.push_back(path_str);\n\t}\n\n\tsort(ret.begin(), ret.end());\n\treturn ret;\n}\n\nvector<string> GetDirsNames(string szRoot) {\n\tvector<string> ret;\n\n\tfor (bst_fs::directory_iterator itr(szRoot); itr != bst_fs::directory_iterator(); ++itr) {\n\n\t\tstring path_str = itr->path().c_str();\n\n\t\tif (bst_fs::is_directory((itr->status())))\n\t\t\tret.push_back(itr->path().filename().c_str());\n\t}\n\n\tsort(ret.begin(), ret.end());\n\treturn ret;\n}\n\nvector<string> GetFiles(string szRoot, string endwith) {\n\tvector<string> ret;\n\n\tfor (bst_fs::directory_iterator itr(szRoot); itr != bst_fs::directory_iterator(); ++itr) {\n\n\t\tstring path_str = itr->path().c_str();\n\n\t\tif (bst_fs::is_regular_file((itr->status())))\n\t\t{\n\t\t\tif(endwith == \"\" || boost::algorithm::ends_with(path_str, endwith))\n\t\t\t\tret.push_back(path_str);\n\t\t}\n\t}\n\n\tsort(ret.begin(), ret.end());\n\treturn ret;\n}\n\nvector<string> GetFilesExt(string szRoot, string endwith) {\n\tvector<string> ret;\n\n\tfor (bst_fs::directory_iterator itr(szRoot); itr != bst_fs::directory_iterator(); ++itr) {\n\t\tstring path_str = itr->path().c_str();\n\n\t\tif 
(bst_fs::is_regular_file((itr->status())))\n\t\t{\n\t\t\tif(endwith == \"\" || boost::algorithm::ends_with(path_str, endwith))\n\t\t\t\tret.push_back(itr->path().extension().c_str());\n\t\t}\n\t}\n\n\tsort(ret.begin(), ret.end());\n\treturn ret;\n}\n\nvector<string> GetFilesNames(string szRoot, string endwith) {\n\tvector<string> ret;\n\n\tfor (bst_fs::directory_iterator itr(szRoot); itr != bst_fs::directory_iterator(); ++itr) {\n\n\t\tstring path_str = itr->path().c_str();\n\n\t\tif (bst_fs::is_regular_file((itr->status())))\n\t\t{\n\t\t\tif(endwith == \"\" || boost::algorithm::ends_with(path_str, endwith))\n\t\t\t\tret.push_back(itr->path().filename().c_str());\n\t\t}\n\t}\n\n\tsort(ret.begin(), ret.end());\n\n\treturn ret;\n}\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n}\n\n"
  },
  {
    "path": "eclipse-project/ibrahim16-deep-act-rec-part/src/utilities.h",
    "content": "/*\n * general_utilities.h\n *\n *  Created on: 2015-03-11\n *      Author: Moustafa S. Ibrahim\n */\n\n#ifndef GENERAL_UTILITIES_H_\n#define GENERAL_UTILITIES_H_\n\n#include \"custom-macros.h\"\n\n#include <assert.h>\n#include<string>\n#include<vector>\n#include<map>\n#include<set>\n#include<iostream>\n#include<sstream>\n#include<fstream>\n\nusing std::string;\nusing std::ostringstream;\nusing std::istringstream;\nusing std::ifstream;\nusing std::set;\nusing std::map;\nusing std::vector;\nusing std::cout;\nusing std::cerr;\nusing std::pair;\n\nnamespace MostCV {\n\nconst char PATH_SEP = '/';\n\nint dcmp(double x, double y);\n\ndouble round(double d, int precision);\n\nvoid fixDir(string &dir);\n\nbool IsPathExist(string path);\n\nstring getFileName(string dir);\n\nbool fileExist(string szFilePath, bool print = true);\n\nstring trim(string str);\n\nstring toLower(string str);\n\nstring toUpper(string str);\n\nbool startsWith(string str, string pat);\n\nint random(int range);\n\nchar* toCharArr(string str);\n\nstring toIntStr(string st, int add, bool append_zeros = true);\n\nstring removeExt(string name);\n\nmap<string, int> BuildStringIdMap(set<string> classId);\n\nmap<string, int> BuildStringIdMap(vector<string> classesVec);\n\nint UpdateStringIdMap(map<string, int> &items_map, string str);\n\nint CountFileLines(string path);\n\nvector<int> GetPerm(int length, int seed = 123);\n\nstring consumeStringParam(int &argc, char** &argv, string variable_name = \"\");\nint consumeIntParam(int &argc, char** &argv, string variable_name = \"\");\ndouble consumeDoubleParam(int &argc, char** &argv, string variable_name = \"\");\n\nvector<string> GetDirs(string szRoot);\nvector<string> GetDirsNames(string szRoot);\nvector<string> GetFiles(string szRoot, string endwith = \"\");\nvector<string> GetFilesExt(string szRoot, string endwith = \"\");\nvector<string> GetFilesNames(string szRoot, string endwith = 
\"\");\n\n////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////\n\ntemplate<class Type> Type toType(string data, Type indicator, string variable_name = \"\") {\n  istringstream iss(data);\n\n  Type item;\n  iss >> item;\n\n  if(iss.fail())\n  {\n    if(variable_name != \"\")\n      cerr<<\"Problem in reading variable: \"<<variable_name<<\"\\n\";\n    cerr<<\"Failed to convert string: [\"<<data<<\"] to same type as variable [\"<<indicator<<\"]\\n\";\n    assert(false);\n  }\n\n  return item;\n}\n\ntemplate<class Type> Type consumeParam(int &argc, char** &argv, Type indicator, string variable_name = \"\")\n{\n  assert(argc > 0);\n  string ret = argv[0];\n  --argc, ++argv;\n  return toType(ret, indicator, variable_name);\n}\n\n\n\ntemplate<class Type> char* toCharPtr(Type val) {\n  ostringstream oss;\n  oss << val;\n  return toCharArr(oss.str());\n}\n\ntemplate<class Type> string toString(Type val) {\n  ostringstream oss;\n  oss << val;\n  return oss.str();\n}\n\ntemplate<class Type> vector<Type> readStringItems(string data, Type indicator) {\n  vector<Type> items;\n  Type item;\n\n  istringstream iss(data);\n\n  while (iss >> item)\n    items.push_back(item);\n\n  return items;\n}\n\ntemplate<class Type> vector<Type> readFileItems(string filePath, Type indicator, bool print = true) {\n  vector<Type> items;\n  Type item;\n\n  ifstream fin(filePath.c_str());\n  if (!fin) {\n    if (print)\n      printf(\"\\n\\tWARNING: readFileItems: Failed to open file [%s]\\n\", filePath.c_str());\n    fflush(stdout);\n    return items;\n  }\n\n  while (fin >> item)\n    items.push_back(item);\n\n  fin.close();\n  return items;\n}\n\ntemplate<class Type> vector<Type> readFileItems(ifstream &fin, Type indicator, int length = -1) {\n  Type item;\n  vector<Type> items;\n\n  if(length == -1)\n  {\n\t  while (fin >> item)\n\t      items.push_back(item);\n  }\n  else\n  {\n\t  items.resize(length);\n\n\t  for (int 
pos = 0; pos < items.size(); ++pos)\n\t  {\n\t\t  fin >> item;\n\n\t\t  assert(!fin.fail());\n\n\t\t  items[pos] = item;\n\t  }\n  }\n\n\n  return items;\n}\n\ntemplate<class Type> vector<Type> readStreamItems(istringstream &iss, Type indicator, int length = -1) {\n  Type item;\n  vector<Type> items;\n\n  if(length == -1)\n  {\n\t  while (iss >> item)\n\t      items.push_back(item);\n  }\n  else\n  {\n\t  items.resize(length);\n\n\t  for (int pos = 0; pos < items.size(); ++pos)\n\t  {\n\t\t  iss >> item;\n\n\t\t  assert(!iss.fail());\n\n\t\t  items[pos] = item;\n\t  }\n  }\n\n\n  return items;\n}\n\n\ntemplate<class Type> vector<vector<Type> > read2dFileItems(string filePath, Type indicator, bool print = true) {\n  vector<vector<Type> > items;\n\n  ifstream fin(filePath.c_str());\n\n  if (fin.fail()) {\n      printf(\"read2dFileItems: Failed to open file [%s]\\n\", filePath.c_str());\n      assert(false);\n  }\n\n  string line;\n  while (getline(fin, line))\n  {\n\t  if(line != \"\")\n\t\t  items.push_back(readStringItems(line, indicator));\n  }\n\n  return items;\n}\n\n\n\n// For every element that has max frequency, add its position. 
Total elements equal the # of unique elements\n// 2 3 2 2 2 2 4 4    => 0 6 1\ntemplate<class Type> vector<int> getMaxFrequentPositions(vector<Type> &vec) {\n  vector<int> retVec;\n  map<Type, vector<int> > freq_map;\n\n  for (int i = 0; i < (int) vec.size(); ++i)\n    freq_map[vec[i]].push_back(i);\n\n  set<pair<int, vector<int> >, std::greater<pair<int, vector<int> > > > freqs;\n\n  for (auto kv : freq_map)\n    freqs.insert(std::make_pair((int) kv.second.size(), kv.second));\n\n  for (auto group : freqs)\n    retVec.push_back(group.second[0]);\n\n  return retVec;\n}\n\ntemplate<class Type> Type getMaxFrequentLabel(vector<Type> &vec)\n{\n  assert(vec.size() > 0);\n\n  vector<int> pos = getMaxFrequentPositions(vec);\n\n  return vec[ pos[0] ];\n}\n\ntemplate<class Type> map<Type, int> getFrequencyMap(const vector<Type> &vec, bool print = false) {\n  map<Type, int> freq_map;\n\n  for (int i = 0; i < (int) vec.size(); ++i)\n    freq_map[vec[i]]++;\n\n  if (print) {\n    for (auto kv : freq_map)\n      cerr << \"Key = \"<<kv.first << \"\\t => Value \" << kv.second << \" instances\\n\";\n  }\n\n  return freq_map;\n}\n\ntemplate<class Type> map<Type, int> getFrequencyMapPercent(vector<Type> &vec, bool print = false) {\n  map<Type, int> freq_map;\n\n  for (int i = 0; i < (int) vec.size(); ++i)\n    freq_map[vec[i]]++;\n\n  if (print) {\n    cerr.precision(1);\n    cerr.setf(std::ios::fixed);\n\n    for (auto kv : freq_map)\n      cerr << \"Key = \"<<kv.first << \"\\t => Value \" << 100.0 * kv.second / (double)vec.size()<< \" %\\n\";\n  }\n\n  return freq_map;\n}\n\ntemplate<class Type1, class Type2> vector<Type2> castVector(const vector<Type1> &row, Type2 indicator) {\n  vector<Type2> ret;\n\n  ret.reserve(row.size());\n\n  for(auto val : row)\n    ret.push_back((Type2)val);\n\n  return ret;\n}\n\ntemplate<class Type1, class Type2> vector<vector<Type2>> cast2DVector(const vector<vector<Type1>> &matrix, Type2 indicator) {\n  vector<vector<Type2>> ret;\n\n  ret.reserve(matrix.size());\n\n  
for(auto row : matrix)\n    ret.push_back(castVector(row, indicator));\n\n  return ret;\n}\n\n\n}\n\n#endif /* GENERAL_UTILITIES_H_ */\n"
  },
  {
    "path": "eclipse-project/ibrahim16-deep-act-rec-part/src/volleyball-dataset-mgr.cpp",
    "content": "/*\n * volleyball-dataset-mgr.cpp\n *\n *  Created on: Nov 28, 2015\n *      Author: msibrahi\n */\n\n#include \"volleyball-dataset-mgr.h\"\n\n#include <boost/filesystem.hpp>\nnamespace bst_fs = boost::filesystem;\n\n#include \"utilities.h\"\n#include \"images-utilities.h\"\n\n#include <boost/random/mersenne_twister.hpp>\n#include <boost/random/uniform_int.hpp>\n#include <boost/random/variate_generator.hpp>\n\nnamespace MostCV {\n\nmap<string, int> global_video_id_frame_id_to_activityId;\nmap<string, vector<VolleyballPerson> > global_video_id_frame_id_to_persons;\n\nmap<string, int> persons_actions_ids_map;\nmap<string, int> scene_activities_ids_map;\n\n// statistics\nmap<string, int> scene_activities_freq_map;\nmap<string, int> players_activities_freq_map;\n\nVolleyballVideoData::VolleyballVideoData(string video_id, string video_dir) {\n  MostCV::fixDir(video_dir);\n\n  video_id_ = video_id;\n  video_dir_ = video_dir;\n\n  string annot_file = video_dir + \"annotations.txt\";\n  vector<vector<string> > data2dVec = MostCV::read2dFileItems(annot_file, string(\"\"), false);\n\n  // For every frame, read the players in it\n  for (auto frame_data : data2dVec) {\n    VolleyballPerson person;\n\n    string frame_id = frame_data[0];\n\n    GetFramePath(frame_id); // verify on hard disk\n\n    frame_data.erase(frame_data.begin());\n\n   // if (frame_data[0].find(\"win\") == string::npos)\n   //   continue;\n\n    scene_activities_freq_map[ frame_data[0] ]++;\n\n    int frame_activity_id = MostCV::UpdateStringIdMap(scene_activities_ids_map, frame_data[0]);\n    annot_frame_id_to_activity_id_map_[frame_id] = frame_activity_id;\n    frame_data.erase(frame_data.begin());\n\n    pair<int, int> min_max_persons_y = { 10000, 0 };\n\n    for (int k = 0; k < (int) frame_data.size(); k += 5) {\n      int x = MostCV::toType(frame_data[k + 0], 0);\n      int y = MostCV::toType(frame_data[k + 1], 0);\n      int w = MostCV::toType(frame_data[k + 2], 0);\n      int h = 
MostCV::toType(frame_data[k + 3], 0);\n      string activity_str = frame_data[k + 4];\n\n      players_activities_freq_map[activity_str]++;\n\n      min_max_persons_y.first = std::min(min_max_persons_y.first, y);\n      min_max_persons_y.second = std::max(min_max_persons_y.second, y + h);\n\n      person.bbox_ = RectHelper(Rect(x, y, w, h));\n      person.action_id_ = MostCV::UpdateStringIdMap(persons_actions_ids_map, activity_str);\n\n      annot_frame_id_persons_map_[frame_id].push_back(person);\n    }\n\n\n\n    if (min_max_persons_y.first < 0)\n      min_max_persons_y.first = 0;\n\n    annot_frame_id_to_min_max_persons_y_map_[frame_id] = min_max_persons_y;\n    annot_frame_id_vec_.push_back(frame_id);\n\n    string video_id_frame_id = video_id + \"#\"+frame_id;\n\n    global_video_id_frame_id_to_activityId[video_id_frame_id] = frame_activity_id;\n    global_video_id_frame_id_to_persons[video_id_frame_id] = annot_frame_id_persons_map_[frame_id];\n\n    if (annot_frame_id_persons_map_[frame_id].size() < 7)\n    {\n    \tcerr<<\"video \"<<video_id_frame_id<<\" frame id \"<<frame_id\n    \t    <<\" has \"<<annot_frame_id_persons_map_[frame_id].size()<<\" persons\\n\";\n    }\n\n    if (annot_frame_id_persons_map_[frame_id].size() > 12)\n        {\n          cerr<<\"video \"<<video_id_frame_id<<\" frame id \"<<frame_id\n              <<\" has \"<<annot_frame_id_persons_map_[frame_id].size()<<\" persons!!\\n\";\n        }\n  }\n\n  SortPersonsPerFrames();\n\n  cerr << video_id_ << \" is processed\\n\";\n}\n\nvoid VolleyballVideoData::SortPersonsPerFrames() {\n  // Sorting the persons based on top left point: x first, if tie, y first. 
Kind of left-to-right sweeping\n  for (auto &frame_persons_kv : annot_frame_id_persons_map_) {\n    vector<VolleyballPerson> &persons = frame_persons_kv.second;\n\n    sort(persons.begin(), persons.end(), [](const VolleyballPerson &a, const VolleyballPerson &b)\n    {\n      if(a.bbox_.r.x != b.bbox_.r.x)\n      return a.bbox_.r.x < b.bbox_.r.x;\n      return a.bbox_.r.y < b.bbox_.r.y;\n    });\n  }\n}\n\nvoid VolleyballVideoData::ResetPersons(string img_name, vector<RectHelper> rects) {\n  annot_frame_id_persons_map_[img_name].clear();\n\n  for (auto rect : rects) {\n    VolleyballPerson person;\n\n    person.bbox_ = rect;\n    person.action_id_ = 0;\n\n    annot_frame_id_persons_map_[img_name].push_back(person);\n  }\n}\n\nvector<RectHelper> VolleyballVideoData::GetPersonsRect(string frame_id) {\n  vector<RectHelper> rects;\n\n  for (auto person : annot_frame_id_persons_map_[frame_id])\n    rects.push_back(person.bbox_);\n\n  return rects;\n}\n\n// Short Util\nstring VolleyballVideoData::GetFramePath(string frame_id, int shift) {\n  string frame_id_no_ext = frame_id.substr(0, frame_id.find_first_of('.'));\n  string ext = frame_id.substr(frame_id.find_first_of('.'));\n  string target_frame_id = MostCV::toIntStr(frame_id_no_ext, shift, false);\n  string frame_new_path = video_dir_ + frame_id_no_ext + MostCV::PATH_SEP + target_frame_id + ext;\n\n  assert(boost::filesystem::exists(frame_new_path));\n\n  return frame_new_path;\n}\n\npair<vector<string>, vector<string> > VolleyballVideoData::GetTemporalWindowPaths(string frame_id, int temporal_window, int step, bool is_use_expend_factor) {\n  vector<string> window_frames_after;\n  vector<string> window_frames_before;\n\n  if (is_use_expend_factor)\n    temporal_window = 2 * temporal_window + 1;\n\n  LP(w, 1+temporal_window/2)\n  {\n    string path = GetFramePath(frame_id, -w * step);\n    window_frames_before.push_back(path);\n  }\n\n  LP(w, (temporal_window+1)/2)\n  {\n    string path = GetFramePath(frame_id, w * 
step);\n    window_frames_after.push_back(path);\n  }\n\n  return {window_frames_before, window_frames_after};\n}\n\nvector<string> VolleyballVideoData::GetTemporalWindowPathsMerged(string frame_id, int temporal_window, int step) {\n  vector<string> paths;\n\n  int start = -temporal_window/2;\n\n  LP(w, temporal_window)\n  {\n    string path = GetFramePath(frame_id, start * step);\n    paths.push_back(path);\n    ++start;\n  }\n  return paths;\n}\n\nvoid VolleyballVideoData::visualize()\n{\n  for (auto frame_id : annot_frame_id_vec_)\n  {\n    string path = GetFramePath(frame_id);\n    Mat img = cv::imread(path);\n\n    cerr<<video_id_<<\" \"<<frame_id<<\" \"<<path<<\"\\n\";\n    RectHelper::DrawRects(img, GetPersonsRect(frame_id));\n  }\n}\n\n\n\n\n\n\n\n\n\n//---------------------------------------------------------------\n\nVolleyballDatasetPart::VolleyballDatasetPart(string dataset_name, string config_file, string videos_root_dir) {\n\n  cerr << \"Preparing Dataset: \" << dataset_name << \"\\n\\tfrom config file: \" << config_file << \"\\n\";\n\n  assert(MostCV::IsPathExist(config_file));\n\n  MostCV::fixDir(videos_root_dir);\n  ids_ = MostCV::readFileItems(config_file, string(\"\"), false);\n  dataset_name_ = dataset_name;\n\n  for (auto video_seq : ids_)\n\t  videos_vec_.push_back(VolleyballVideoData(video_seq, videos_root_dir + video_seq));\n\n  cerr << \"\\n\\n************************\\n\\n\";\n}\n\nvoid VolleyballDatasetPart::ReorderVideos(vector<string> video_ids) {\n  for (int i = 0; i < (int) video_ids.size(); ++i) {\n    for (int j = 0; j < (int) ids_.size(); ++j) {\n      if (video_ids[i] != ids_[j])\n        continue;\n      std::swap(ids_[i], ids_[j]);\n      std::swap(videos_vec_[i], videos_vec_[j]);\n    }\n  }\n}\n\nvector<pair<VolleyballVideoData, int> > VolleyballDatasetPart::GetVideoFrameList(bool is_shuffled, int subset_percent) {\n  vector<pair<VolleyballVideoData, int> > database_shuffled;\n\n  boost::mt19937 
generator(100);\n  boost::uniform_int<> uni_dist;\n  boost::variate_generator<boost::mt19937&, boost::uniform_int<> > rand_generator(generator, uni_dist);\n\n  vector<int> labels;\n\n  for (auto video : videos_vec_) {\n    int frame_pos = -1;\n\n    for (auto frame_id : video.annot_frame_id_vec_) {\n      ++frame_pos;\n\n      database_shuffled.push_back(std::make_pair(video, frame_pos));\n    }\n  }\n\n  if (is_shuffled) {\n    cerr << \"Before: Total Shuffled Elements: \" << database_shuffled.size() << \" with 1st video \" << database_shuffled.begin()->first.video_id_ << \"\\n\";\n\n    std::random_shuffle(database_shuffled.begin(), database_shuffled.end(), rand_generator);\n\n    cerr << \"After: Total Shuffled Elements: \" << database_shuffled.size() << \" with 1st video \" << database_shuffled.begin()->first.video_id_ << \"\\n\";\n  }\n\n  int max_size = subset_percent * (int) database_shuffled.size() / 100;\n  database_shuffled.resize(max_size);\n\n  return database_shuffled;\n}\n\nvoid VolleyballDatasetPart::visualize()\n{\n  for (auto video : videos_vec_)\n    video.visualize();\n}\n\n\n\n\n\n\n\n\n//---------------------------------------------------------------\n\nVolleyballDatasetMgr::VolleyballDatasetMgr(string config_dir_path, string videos_root_dir) {\n  MostCV::fixDir(config_dir_path);\n\n  dataset_division_.push_back(VolleyballDatasetPart(\"train\", config_dir_path + \"train.txt\", videos_root_dir));\n  dataset_division_.push_back(VolleyballDatasetPart(\"val\", config_dir_path + \"val.txt\", videos_root_dir));\n  dataset_division_.push_back(VolleyballDatasetPart(\"test\", config_dir_path + \"test.txt\", videos_root_dir));\n  dataset_division_.push_back(VolleyballDatasetPart(\"trainval\", config_dir_path + \"trainval.txt\", videos_root_dir));\n\n  total_videos_ = 0;\n  total_frames_ = 0;\n\n  // Remove empty datasets\n  for (int i = 0; i < (int) dataset_division_.size(); ++i) {\n    if (dataset_division_[i].videos_vec_.size() == 0) {\n      cerr << 
dataset_division_[i].dataset_name_ << \" dataset is EMPTY\\n\";\n\n      dataset_division_.erase(dataset_division_.begin() + i);\n      --i;\n    }\n  }\n\n  assert(dataset_division_.size() > 0);\n\n  for (auto dataset : dataset_division_) {\n    int current_frames = 0;\n\n    for (auto video : dataset.videos_vec_) {\n      total_frames_ += video.annot_frame_id_vec_.size();\n      current_frames += video.annot_frame_id_vec_.size();\n    }\n    cerr << \"Total frames for dataset \" << dataset.dataset_name_ << \" = \" << current_frames << \"\\n\";\n\n    total_videos_ += dataset.videos_vec_.size();\n  }\n\n  total_scene_labels = scene_activities_ids_map.size();\n  total_persons_labels = persons_actions_ids_map.size();\n\n  cerr << \"\\nTotal videos = \" << total_videos_ << \" - total frames = \" << total_frames_ << \"\\n\";\n\n  cerr << \"\\nScenes Labels:\\n\";\n  for (auto scene_kv : scene_activities_ids_map)\n    cerr << \"\\t\" << scene_kv.first << \" \" << scene_kv.second << \"\\n\";\n\n  cerr << \"\\nPersons Labels:\\n\";\n  for (auto persons_kv : persons_actions_ids_map)\n    cerr << \"\\t\" << persons_kv.first << \" \" << persons_kv.second << \"\\n\";\n\n  cerr << \"\\nScenes Labels frequency:\\n\";\n  for (auto entry : scene_activities_freq_map)\n    cerr << \"\\t\" << entry.first << \" \" << entry.second << \"\\n\";\n\n  cerr << \"\\nPlayers Labels frequency:\\n\";\n  for (auto entry : players_activities_freq_map)\n    cerr << \"\\t\" << entry.first << \" \" << entry.second << \"\\n\";\n}\n\nint VolleyballDatasetMgr::GetActivityId(string video_id, string frame_id)\n{\n  string video_id_frame_id = video_id + \"#\"+frame_id;\n\n if (global_video_id_frame_id_to_activityId.count(video_id_frame_id) == 0)\n {\n   cerr<<\"problem with \"<<video_id_frame_id<<\"\\n\\n\";\n   return -1;\n }\n\n  assert( global_video_id_frame_id_to_activityId.count(video_id_frame_id) );\n\n  return global_video_id_frame_id_to_activityId[video_id_frame_id];\n}\n\nvector<VolleyballPerson> 
VolleyballDatasetMgr::GetPersons(string video_id, string frame_id)\n{\n  string video_id_frame_id = video_id + \"#\"+frame_id;\n\n  assert( global_video_id_frame_id_to_persons.count(video_id_frame_id) );\n\n  return global_video_id_frame_id_to_persons[video_id_frame_id];\n}\n\n\n// Verify that 2*w+1 frames exist, i.e. a temporal window centered around every frame\nvoid VolleyballDatasetMgr::VerifyDataAvailbility(int temporal_window)\n{\n  for (auto dataset : dataset_division_) {\n    cerr<<\"Verifying dataset: \"<<dataset.dataset_name_<<\"\\n\";\n    for (auto video : dataset.videos_vec_) {\n      for (auto frame_id : video.annot_frame_id_vec_) {\n        video.GetTemporalWindowPaths(frame_id, temporal_window, 1, true);\n      }\n    }\n  }\n}\n\n\n}\n"
  },
  {
    "path": "eclipse-project/ibrahim16-deep-act-rec-part/src/volleyball-dataset-mgr.h",
    "content": "/*\n * coactivity-dataset-mgr.h\n *\n *  Created on: Nov 28, 2015\n *      Author: msibrahi\n */\n\n#ifndef VOLLEYBALL_DATASET_MGR_H_FINAL_DATASET_\n#define VOLLEYBALL_DATASET_MGR_H_FINAL_DATASET_\n\n#include <string>\n#include <vector>\n#include <set>\nusing std::vector;\nusing std::set;\nusing std::string;\nusing std::pair;\n\n#include \"opencv2/core/core.hpp\"\n#include \"opencv2/highgui/highgui.hpp\"\n#include \"opencv2/imgproc/imgproc.hpp\"\nusing cv::Mat;\nusing cv::Size;\nusing cv::Ptr;\n\n#include \"rect-helper.h\"\n\nnamespace MostCV {\n\n\nclass VolleyballPerson {\n public:\n  RectHelper  bbox_;\n  int action_id_;\n};\n\nclass VolleyballVideoData {\n public:\n  VolleyballVideoData() {}\n  VolleyballVideoData(string video_id, string video_dir);\n\n  string GetFramePath(string frame_id, int shift = 0);\n  pair< vector<string>, vector<string> > GetTemporalWindowPaths(string frame_id, int temporal_window, int step = 1, bool is_use_expend_factor = true);\n  vector<string> GetTemporalWindowPathsMerged(string frame_id, int temporal_window, int step = 1);\n  void ResetPersons(string frame_id, vector<RectHelper> rects);\n  vector<RectHelper> GetPersonsRect(string frame_id);\n  void SortPersonsPerFrames();\n  void visualize();\n\n\n  string video_id_;\n  string video_dir_;\n\n  vector<string>  annot_frame_id_vec_;\n  map<string, int> annot_frame_id_to_activity_id_map_;\n  map<string, pair<int, int>> annot_frame_id_to_min_max_persons_y_map_;\n\n  map<string, vector<VolleyballPerson> > annot_frame_id_persons_map_;\n};\n\nclass VolleyballDatasetPart {\n public:\n  VolleyballDatasetPart() {}\n  VolleyballDatasetPart(string dataset_name, string config_file, string videos_root_dir);\n  void ReorderVideos(vector<string> video_ids);\n  vector<pair<VolleyballVideoData, int> > GetVideoFrameList(bool is_shuffled, int subset_percent);\n  void visualize();\n\n  vector<string> ids_;\n  vector<VolleyballVideoData> videos_vec_;\n  string dataset_name_;\n\n  string 
dataset_db_name_;\n  string dataset_db_path_;\n\n};\n\nclass VolleyballDatasetMgr {\n public:\n  VolleyballDatasetMgr(string config_dir_path, string videos_root_dir);\n\n  void VerifyDataAvailbility(int temporal_window);\n\n  int GetActivityId(string video_id, string frame_id);\n  vector<VolleyballPerson> GetPersons(string video_id, string frame_id);\n\n  vector<VolleyballDatasetPart> dataset_division_;\n  int total_videos_;\n  int total_frames_;\n  int total_scene_labels;\n  int total_persons_labels;\n};\n\n}\n\n#endif /* VOLLEYBALL_DATASET_MGR_H_FINAL_DATASET_ */\n"
  },
  {
    "path": "ibrahim16-cvpr/p1-network1/clip_w5.txt",
    "content": "examples/deep-activity-rec/ibrahim16-cvpr/none.jpg 0\nexamples/deep-activity-rec/ibrahim16-cvpr/none.jpg 1\nexamples/deep-activity-rec/ibrahim16-cvpr/none.jpg 1\nexamples/deep-activity-rec/ibrahim16-cvpr/none.jpg 1\nexamples/deep-activity-rec/ibrahim16-cvpr/none.jpg 1\n"
  },
  {
    "path": "ibrahim16-cvpr/p1-network1/trainval-test-create-mean-script.sh",
    "content": "#!/usr/bin/env sh\n# This script computes the image mean of the volleyball trainval leveldb.\n\nOUTDIR=examples/deep-activity-rec/ibrahim16-cvpr/p1-network1\n\necho \"Computing image mean for trainval dataset: \" $OUTDIR\n\n./build/tools/compute_image_mean -backend=leveldb $OUTDIR/trainval-leveldb $OUTDIR/mean.binaryproto\n\necho \"Done.\"\n"
  },
  {
    "path": "ibrahim16-cvpr/p1-network1/trainval-test-exe-script-resume.sh",
    "content": "#!/usr/bin/env sh\n\n\nOUTDIR=examples/deep-activity-rec/ibrahim16-cvpr/p1-network1\nGPU_ID=0\nITER=15000\n\necho \"Resuming Caffe using GPU\" $GPU_ID \"In Directory \" $OUTDIR \"Starting from iteration \" $ITER\n\n./build/tools/caffe train 2> $OUTDIR/z_trainval-test-log-resume.txt \\\n  --solver $OUTDIR/trainval-test-solver.prototxt    --snapshot=$OUTDIR/z_snapshot_iter_$ITER.solverstate  --gpu $GPU_ID\n"
  },
  {
    "path": "ibrahim16-cvpr/p1-network1/trainval-test-exe-script.sh",
    "content": "#!/usr/bin/env sh\n\nOUTDIR=examples/deep-activity-rec/ibrahim16-cvpr/p1-network1\nGPU_ID=0\n\necho \"Running Caffe using GPU\" $GPU_ID \"In Directory \" $OUTDIR\n\n./build/tools/caffe train 2> $OUTDIR/z_trainval-test-log.txt \\\n  --solver $OUTDIR/trainval-test-solver.prototxt      --weights    models/bvlc_reference_caffenet/bvlc_reference_caffenet.caffemodel    --gpu $GPU_ID\n"
  },
  {
    "path": "ibrahim16-cvpr/p1-network1/trainval-test-network.prototxt",
    "content": "name: \"volleyball_game_proto\"\nlayer {\n  name: \"clip_data\"\n  type: \"ImageData\"\n  top: \"dummy\"\n  top: \"clip\"\n\n  image_data_param {\n    source: \"examples/deep-activity-rec/ibrahim16-cvpr/p1-network1/clip_w5.txt\"\n    batch_size: 250\n  }\n}\n\nlayer {\n  name: \"Silence\"\n  type: \"Silence\"\n  bottom: \"dummy\"\n}\n\nlayer {\n  name: \"volleyball_game\"\n  type: \"Data\"\n  top: \"data\"\n  top: \"label\"\n  include {\n    phase: TRAIN\n  }\n  transform_param {\n    mirror: true\n    crop_size: 227\n    mean_file: \"examples/deep-activity-rec/ibrahim16-cvpr/p1-network1/mean.binaryproto\"\n  }\n  data_param {\n    source: \"examples/deep-activity-rec/ibrahim16-cvpr/p1-network1/trainval-leveldb\"\n    batch_size: 250\n    backend: LEVELDB\n  }\n}\nlayer {\n  name: \"volleyball_game\"\n  type: \"Data\"\n  top: \"data\"\n  top: \"label\"\n  include {\n    phase: TEST\n  }\n  transform_param {\n    mirror: false\n    crop_size: 227\n     mean_file: \"examples/deep-activity-rec/ibrahim16-cvpr/p1-network1/mean.binaryproto\"\n  }\n  data_param {\n source: \"examples/deep-activity-rec/ibrahim16-cvpr/p1-network1/test-leveldb\"\n    batch_size: 250\n    backend: LEVELDB\n  }\n}\nlayer {\n  name: \"conv1\"\n  type: \"Convolution\"\n  bottom: \"data\"\n  top: \"conv1\"\n  param {\n    lr_mult: 1\n    decay_mult: 1\n  }\n  param {\n    lr_mult: 2\n    decay_mult: 0\n  }\n  convolution_param {\n    num_output: 96\n    kernel_size: 11\n    stride: 4\n    weight_filler {\n      type: \"gaussian\"\n      std: 0.01\n    }\n    bias_filler {\n      type: \"constant\"\n      value: 0\n    }\n  }\n}\nlayer {\n  name: \"relu1\"\n  type: \"ReLU\"\n  bottom: \"conv1\"\n  top: \"conv1\"\n}\nlayer {\n  name: \"pool1\"\n  type: \"Pooling\"\n  bottom: \"conv1\"\n  top: \"pool1\"\n  pooling_param {\n    pool: MAX\n    kernel_size: 3\n    stride: 2\n  }\n}\nlayer {\n  name: \"norm1\"\n  type: \"LRN\"\n  bottom: \"pool1\"\n  top: \"norm1\"\n  lrn_param {\n    
local_size: 5\n    alpha: 0.0001\n    beta: 0.75\n  }\n}\nlayer {\n  name: \"conv2\"\n  type: \"Convolution\"\n  bottom: \"norm1\"\n  top: \"conv2\"\n  param {\n    lr_mult: 1\n    decay_mult: 1\n  }\n  param {\n    lr_mult: 2\n    decay_mult: 0\n  }\n  convolution_param {\n    num_output: 256\n    pad: 2\n    kernel_size: 5\n    group: 2\n    weight_filler {\n      type: \"gaussian\"\n      std: 0.01\n    }\n    bias_filler {\n      type: \"constant\"\n      value: 1\n    }\n  }\n}\nlayer {\n  name: \"relu2\"\n  type: \"ReLU\"\n  bottom: \"conv2\"\n  top: \"conv2\"\n}\nlayer {\n  name: \"pool2\"\n  type: \"Pooling\"\n  bottom: \"conv2\"\n  top: \"pool2\"\n  pooling_param {\n    pool: MAX\n    kernel_size: 3\n    stride: 2\n  }\n}\nlayer {\n  name: \"norm2\"\n  type: \"LRN\"\n  bottom: \"pool2\"\n  top: \"norm2\"\n  lrn_param {\n    local_size: 5\n    alpha: 0.0001\n    beta: 0.75\n  }\n}\nlayer {\n  name: \"conv3\"\n  type: \"Convolution\"\n  bottom: \"norm2\"\n  top: \"conv3\"\n  param {\n    lr_mult: 1\n    decay_mult: 1\n  }\n  param {\n    lr_mult: 2\n    decay_mult: 0\n  }\n  convolution_param {\n    num_output: 384\n    pad: 1\n    kernel_size: 3\n    weight_filler {\n      type: \"gaussian\"\n      std: 0.01\n    }\n    bias_filler {\n      type: \"constant\"\n      value: 0\n    }\n  }\n}\nlayer {\n  name: \"relu3\"\n  type: \"ReLU\"\n  bottom: \"conv3\"\n  top: \"conv3\"\n}\nlayer {\n  name: \"conv4\"\n  type: \"Convolution\"\n  bottom: \"conv3\"\n  top: \"conv4\"\n  param {\n    lr_mult: 1\n    decay_mult: 1\n  }\n  param {\n    lr_mult: 2\n    decay_mult: 0\n  }\n  convolution_param {\n    num_output: 384\n    pad: 1\n    kernel_size: 3\n    group: 2\n    weight_filler {\n      type: \"gaussian\"\n      std: 0.01\n    }\n    bias_filler {\n      type: \"constant\"\n      value: 1\n    }\n  }\n}\nlayer {\n  name: \"relu4\"\n  type: \"ReLU\"\n  bottom: \"conv4\"\n  top: \"conv4\"\n}\nlayer {\n  name: \"conv5\"\n  type: \"Convolution\"\n  bottom: 
\"conv4\"\n  top: \"conv5\"\n  param {\n    lr_mult: 1\n    decay_mult: 1\n  }\n  param {\n    lr_mult: 2\n    decay_mult: 0\n  }\n  convolution_param {\n    num_output: 256\n    pad: 1\n    kernel_size: 3\n    group: 2\n    weight_filler {\n      type: \"gaussian\"\n      std: 0.01\n    }\n    bias_filler {\n      type: \"constant\"\n      value: 1\n    }\n  }\n}\nlayer {\n  name: \"relu5\"\n  type: \"ReLU\"\n  bottom: \"conv5\"\n  top: \"conv5\"\n}\nlayer {\n  name: \"pool5\"\n  type: \"Pooling\"\n  bottom: \"conv5\"\n  top: \"pool5\"\n  pooling_param {\n    pool: MAX\n    kernel_size: 3\n    stride: 2\n  }\n}\nlayer {\n  name: \"fc6\"\n  type: \"InnerProduct\"\n  bottom: \"pool5\"\n  top: \"fc6\"\n  param {\n    lr_mult: 1\n    decay_mult: 1\n  }\n  param {\n    lr_mult: 2\n    decay_mult: 0\n  }\n  inner_product_param {\n    num_output: 4096\n    weight_filler {\n      type: \"gaussian\"\n      std: 0.005\n    }\n    bias_filler {\n      type: \"constant\"\n      value: 1\n    }\n  }\n}\nlayer {\n  name: \"relu6\"\n  type: \"ReLU\"\n  bottom: \"fc6\"\n  top: \"fc6\"\n}\nlayer {\n  name: \"drop6\"\n  type: \"Dropout\"\n  bottom: \"fc6\"\n  top: \"fc6\"\n  dropout_param {\n    dropout_ratio: 0.5\n  }\n}\nlayer {\n  name: \"fc7\"\n  type: \"InnerProduct\"\n  bottom: \"fc6\"\n  top: \"fc7\"\n  # Note that lr_mult can be set to 0 to disable any fine-tuning of this, and any other, layer\n  param {\n    lr_mult: 1\n    decay_mult: 1\n  }\n  param {\n    lr_mult: 2\n    decay_mult: 0\n  }\n  inner_product_param {\n    num_output: 4096\n    weight_filler {\n      type: \"gaussian\"\n      std: 0.005\n    }\n    bias_filler {\n      type: \"constant\"\n      value: 1\n    }\n  }\n}\nlayer {\n  name: \"relu7\"\n  type: \"ReLU\"\n  bottom: \"fc7\"\n  top: \"fc7\"\n}\nlayer {\n  name: \"drop7\"\n  type: \"Dropout\"\n  bottom: \"fc7\"\n  top: \"fc7\"\n  dropout_param {\n    dropout_ratio: 0.5\n  }\n}\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\nlayer {\n  name: \"lstm1\"\n  type: 
\"Lstm\"\n  bottom: \"fc7\"\n  bottom: \"clip\"\n  top: \"lstm1\"\n\n  param {\n    lr_mult: 1\n    decay_mult: 1\n  }\n  param {\n    lr_mult: 2\n    decay_mult: 0\n  }\n\n  lstm_param {\n    num_output: 3000\n    clipping_threshold: 0.1\n    weight_filler {\n      type: \"gaussian\"\n      std: 0.1\n    }\n    bias_filler {\n      type: \"constant\"\n    }\n  }\n}\n\nlayer {\n  name: \"fc8_volleyball\"\n  type: \"InnerProduct\"\n  bottom: \"lstm1\"\n  top: \"fc8_volleyball\"\n  # lr_mult is set to higher than for other layers, because this layer is starting from random while the others are already trained\n  param {\n    lr_mult: 10\n    decay_mult: 1\n  }\n  param {\n    lr_mult: 20\n    decay_mult: 0\n  }\n  inner_product_param {\n    num_output: 9\n    weight_filler {\n      type: \"gaussian\"\n      std: 0.01\n    }\n    bias_filler {\n      type: \"constant\"\n      value: 0\n    }\n  }\n}\n\nlayer {\n  name: \"loss\"\n  type: \"SoftmaxWithLoss\"\n  bottom: \"fc8_volleyball\"\n  bottom: \"label\"\n}\nlayer {\n  name: \"accuracy\"\n  type: \"Accuracy\"\n  bottom: \"fc8_volleyball\"\n  bottom: \"label\"\n  top: \"accuracy\"\n  include {\n    phase: TEST\n  }\n}\n\n\n"
  },
  {
    "path": "ibrahim16-cvpr/p1-network1/trainval-test-solver.prototxt",
    "content": "net: \"examples/deep-activity-rec/ibrahim16-cvpr/p1-network1/trainval-test-network.prototxt\"\n\n# testing examples are 77655 ~= 250 * 310\ntest_iter: 310\ntest_interval: 15000\ndisplay: 1000\n\nbase_lr: 0.00001\nlr_policy: \"step\"\ngamma: 0.1\nstepsize: 15000\nmax_iter: 15000\nmomentum: 0.9\nweight_decay: 0.0005\n\nrandom_seed: 750301\nsolver_mode: GPU\n\nsnapshot: 5000\nsnapshot_prefix: \"examples/deep-activity-rec/ibrahim16-cvpr/p1-network1/z_snapshot\"\nsnapshot_after_train: true\n\n"
  },
  {
    "path": "ibrahim16-cvpr/p3-extract-features-networks/test.prototxt",
    "content": "name: \"volleyball_proto\"\n\nlayer {\n  name: \"volleyball\"\n  type: \"Data\"\n  top: \"data\"\n  top: \"label\"\n  include {\n    phase: TEST\n  }\n  transform_param {\n    mirror: false\n    crop_size: 227\n   \n     mean_file: \"examples/deep-activity-rec/ibrahim16-cvpr/p2-ready-fuse/mean.binaryproto\"\n  }\n  data_param {\n    source: \"examples/deep-activity-rec/ibrahim16-cvpr/p2-ready-fuse/test-leveldb\"\n    batch_size: 120\n    backend: LEVELDB\n  }\n}\nlayer {\n  name: \"conv1\"\n  type: \"Convolution\"\n  bottom: \"data\"\n  top: \"conv1\"\n  param {\n    lr_mult: 1\n    decay_mult: 1\n  }\n  param {\n    lr_mult: 2\n    decay_mult: 0\n  }\n  convolution_param {\n    num_output: 96\n    kernel_size: 11\n    stride: 4\n    weight_filler {\n      type: \"gaussian\"\n      std: 0.01\n    }\n    bias_filler {\n      type: \"constant\"\n      value: 0\n    }\n  }\n}\nlayer {\n  name: \"relu1\"\n  type: \"ReLU\"\n  bottom: \"conv1\"\n  top: \"conv1\"\n}\nlayer {\n  name: \"pool1\"\n  type: \"Pooling\"\n  bottom: \"conv1\"\n  top: \"pool1\"\n  pooling_param {\n    pool: MAX\n    kernel_size: 3\n    stride: 2\n  }\n}\nlayer {\n  name: \"norm1\"\n  type: \"LRN\"\n  bottom: \"pool1\"\n  top: \"norm1\"\n  lrn_param {\n    local_size: 5\n    alpha: 0.0001\n    beta: 0.75\n  }\n}\nlayer {\n  name: \"conv2\"\n  type: \"Convolution\"\n  bottom: \"norm1\"\n  top: \"conv2\"\n  param {\n    lr_mult: 1\n    decay_mult: 1\n  }\n  param {\n    lr_mult: 2\n    decay_mult: 0\n  }\n  convolution_param {\n    num_output: 256\n    pad: 2\n    kernel_size: 5\n    group: 2\n    weight_filler {\n      type: \"gaussian\"\n      std: 0.01\n    }\n    bias_filler {\n      type: \"constant\"\n      value: 1\n    }\n  }\n}\nlayer {\n  name: \"relu2\"\n  type: \"ReLU\"\n  bottom: \"conv2\"\n  top: \"conv2\"\n}\nlayer {\n  name: \"pool2\"\n  type: \"Pooling\"\n  bottom: \"conv2\"\n  top: \"pool2\"\n  pooling_param {\n    pool: MAX\n    kernel_size: 3\n    stride: 2\n  
}\n}\nlayer {\n  name: \"norm2\"\n  type: \"LRN\"\n  bottom: \"pool2\"\n  top: \"norm2\"\n  lrn_param {\n    local_size: 5\n    alpha: 0.0001\n    beta: 0.75\n  }\n}\nlayer {\n  name: \"conv3\"\n  type: \"Convolution\"\n  bottom: \"norm2\"\n  top: \"conv3\"\n  param {\n    lr_mult: 1\n    decay_mult: 1\n  }\n  param {\n    lr_mult: 2\n    decay_mult: 0\n  }\n  convolution_param {\n    num_output: 384\n    pad: 1\n    kernel_size: 3\n    weight_filler {\n      type: \"gaussian\"\n      std: 0.01\n    }\n    bias_filler {\n      type: \"constant\"\n      value: 0\n    }\n  }\n}\nlayer {\n  name: \"relu3\"\n  type: \"ReLU\"\n  bottom: \"conv3\"\n  top: \"conv3\"\n}\nlayer {\n  name: \"conv4\"\n  type: \"Convolution\"\n  bottom: \"conv3\"\n  top: \"conv4\"\n  param {\n    lr_mult: 1\n    decay_mult: 1\n  }\n  param {\n    lr_mult: 2\n    decay_mult: 0\n  }\n  convolution_param {\n    num_output: 384\n    pad: 1\n    kernel_size: 3\n    group: 2\n    weight_filler {\n      type: \"gaussian\"\n      std: 0.01\n    }\n    bias_filler {\n      type: \"constant\"\n      value: 1\n    }\n  }\n}\nlayer {\n  name: \"relu4\"\n  type: \"ReLU\"\n  bottom: \"conv4\"\n  top: \"conv4\"\n}\nlayer {\n  name: \"conv5\"\n  type: \"Convolution\"\n  bottom: \"conv4\"\n  top: \"conv5\"\n  param {\n    lr_mult: 1\n    decay_mult: 1\n  }\n  param {\n    lr_mult: 2\n    decay_mult: 0\n  }\n  convolution_param {\n    num_output: 256\n    pad: 1\n    kernel_size: 3\n    group: 2\n    weight_filler {\n      type: \"gaussian\"\n      std: 0.01\n    }\n    bias_filler {\n      type: \"constant\"\n      value: 1\n    }\n  }\n}\nlayer {\n  name: \"relu5\"\n  type: \"ReLU\"\n  bottom: \"conv5\"\n  top: \"conv5\"\n}\nlayer {\n  name: \"pool5\"\n  type: \"Pooling\"\n  bottom: \"conv5\"\n  top: \"pool5\"\n  pooling_param {\n    pool: MAX\n    kernel_size: 3\n    stride: 2\n  }\n}\nlayer {\n  name: \"fc6\"\n  type: \"InnerProduct\"\n  bottom: \"pool5\"\n  top: \"fc6\"\n  param {\n    lr_mult: 1\n    
decay_mult: 1\n  }\n  param {\n    lr_mult: 2\n    decay_mult: 0\n  }\n  inner_product_param {\n    num_output: 4096\n    weight_filler {\n      type: \"gaussian\"\n      std: 0.005\n    }\n    bias_filler {\n      type: \"constant\"\n      value: 1\n    }\n  }\n}\nlayer {\n  name: \"relu6\"\n  type: \"ReLU\"\n  bottom: \"fc6\"\n  top: \"fc6\"\n}\nlayer {\n  name: \"drop6\"\n  type: \"Dropout\"\n  bottom: \"fc6\"\n  top: \"fc6\"\n  dropout_param {\n    dropout_ratio: 0.5\n  }\n}\nlayer {\n  name: \"fc7\"\n  type: \"InnerProduct\"\n  bottom: \"fc6\"\n  top: \"fc7\"\n  # Note that lr_mult can be set to 0 to disable any fine-tuning of this, and any other, layer\n  param {\n    lr_mult: 1\n    decay_mult: 1\n  }\n  param {\n    lr_mult: 2\n    decay_mult: 0\n  }\n  inner_product_param {\n    num_output: 4096\n    weight_filler {\n      type: \"gaussian\"\n      std: 0.005\n    }\n    bias_filler {\n      type: \"constant\"\n      value: 1\n    }\n  }\n}\nlayer {\n  name: \"relu7\"\n  type: \"ReLU\"\n  bottom: \"fc7\"\n  top: \"fc7\"\n}\nlayer {\n  name: \"drop7\"\n  type: \"Dropout\"\n  bottom: \"fc7\"\n  top: \"fc7\"\n  dropout_param {\n    dropout_ratio: 0.5\n  }\n}\n\nlayer {\n  name: \"lstm1\"\n  type: \"Lstm\"\n  bottom: \"fc7\"\n  top: \"lstm1\"\n\n  param {\n    lr_mult: 10\n    decay_mult: 1\n  }\n  param {\n    lr_mult: 20\n    decay_mult: 0\n  }\n\n  lstm_param {\n    num_output: 3000\n    clipping_threshold: 0.1\n    weight_filler {\n      type: \"gaussian\"\n      std: 0.1\n    }\n    bias_filler {\n      type: \"constant\"\n    }\n  }\n}\n\nlayer {\n  name: \"fc8_volleyball\"\n  type: \"InnerProduct\"\n  bottom: \"lstm1\"\n  top: \"fc8_volleyball\"\n  # lr_mult is set to higher than for other layers, because this layer is starting from random while the others are already trained\n  param {\n    lr_mult: 10\n    decay_mult: 1\n  }\n  param {\n    lr_mult: 20\n    decay_mult: 0\n  }\n  inner_product_param {\n    num_output: 9\n    weight_filler {\n      
type: \"gaussian\"\n      std: 0.01\n    }\n    bias_filler {\n      type: \"constant\"\n      value: 0\n    }\n  }\n}\nlayer {\n  name: \"loss\"\n  type: \"SoftmaxWithLoss\"\n  bottom: \"fc8_volleyball\"\n  bottom: \"label\"\n}\nlayer {\n  name: \"accuracy\"\n  type: \"Accuracy\"\n  bottom: \"fc8_volleyball\"\n  bottom: \"label\"\n  top: \"accuracy\"\n  include {\n    phase: TEST\n  }\n}\n\n\n\nlayer {\n  name: \"prop\"\n  type: \"Softmax\"\n  bottom: \"fc8_volleyball\"\n  top: \"prop\"\n}\n\nlayer {\n  name: \"Silence\"\n  type: \"Silence\"\n  bottom: \"prop\"\n}\n"
  },
  {
    "path": "ibrahim16-cvpr/p3-extract-features-networks/trainval.prototxt",
    "content": "name: \"volleyball_proto\"\n\nlayer {\n  name: \"volleyball\"\n  type: \"Data\"\n  top: \"data\"\n  top: \"label\"\n  include {\n    phase: TEST\n  }\n  transform_param {\n    mirror: false\n    crop_size: 227\n   \n     mean_file: \"examples/deep-activity-rec/ibrahim16-cvpr/p2-ready-fuse/mean.binaryproto\"\n  }\n  data_param {\n    source: \"examples/deep-activity-rec/ibrahim16-cvpr/p2-ready-fuse/trainval-leveldb\"\n    batch_size: 120\n    backend: LEVELDB\n  }\n}\nlayer {\n  name: \"conv1\"\n  type: \"Convolution\"\n  bottom: \"data\"\n  top: \"conv1\"\n  param {\n    lr_mult: 1\n    decay_mult: 1\n  }\n  param {\n    lr_mult: 2\n    decay_mult: 0\n  }\n  convolution_param {\n    num_output: 96\n    kernel_size: 11\n    stride: 4\n    weight_filler {\n      type: \"gaussian\"\n      std: 0.01\n    }\n    bias_filler {\n      type: \"constant\"\n      value: 0\n    }\n  }\n}\nlayer {\n  name: \"relu1\"\n  type: \"ReLU\"\n  bottom: \"conv1\"\n  top: \"conv1\"\n}\nlayer {\n  name: \"pool1\"\n  type: \"Pooling\"\n  bottom: \"conv1\"\n  top: \"pool1\"\n  pooling_param {\n    pool: MAX\n    kernel_size: 3\n    stride: 2\n  }\n}\nlayer {\n  name: \"norm1\"\n  type: \"LRN\"\n  bottom: \"pool1\"\n  top: \"norm1\"\n  lrn_param {\n    local_size: 5\n    alpha: 0.0001\n    beta: 0.75\n  }\n}\nlayer {\n  name: \"conv2\"\n  type: \"Convolution\"\n  bottom: \"norm1\"\n  top: \"conv2\"\n  param {\n    lr_mult: 1\n    decay_mult: 1\n  }\n  param {\n    lr_mult: 2\n    decay_mult: 0\n  }\n  convolution_param {\n    num_output: 256\n    pad: 2\n    kernel_size: 5\n    group: 2\n    weight_filler {\n      type: \"gaussian\"\n      std: 0.01\n    }\n    bias_filler {\n      type: \"constant\"\n      value: 1\n    }\n  }\n}\nlayer {\n  name: \"relu2\"\n  type: \"ReLU\"\n  bottom: \"conv2\"\n  top: \"conv2\"\n}\nlayer {\n  name: \"pool2\"\n  type: \"Pooling\"\n  bottom: \"conv2\"\n  top: \"pool2\"\n  pooling_param {\n    pool: MAX\n    kernel_size: 3\n    stride: 2\n  
}\n}\nlayer {\n  name: \"norm2\"\n  type: \"LRN\"\n  bottom: \"pool2\"\n  top: \"norm2\"\n  lrn_param {\n    local_size: 5\n    alpha: 0.0001\n    beta: 0.75\n  }\n}\nlayer {\n  name: \"conv3\"\n  type: \"Convolution\"\n  bottom: \"norm2\"\n  top: \"conv3\"\n  param {\n    lr_mult: 1\n    decay_mult: 1\n  }\n  param {\n    lr_mult: 2\n    decay_mult: 0\n  }\n  convolution_param {\n    num_output: 384\n    pad: 1\n    kernel_size: 3\n    weight_filler {\n      type: \"gaussian\"\n      std: 0.01\n    }\n    bias_filler {\n      type: \"constant\"\n      value: 0\n    }\n  }\n}\nlayer {\n  name: \"relu3\"\n  type: \"ReLU\"\n  bottom: \"conv3\"\n  top: \"conv3\"\n}\nlayer {\n  name: \"conv4\"\n  type: \"Convolution\"\n  bottom: \"conv3\"\n  top: \"conv4\"\n  param {\n    lr_mult: 1\n    decay_mult: 1\n  }\n  param {\n    lr_mult: 2\n    decay_mult: 0\n  }\n  convolution_param {\n    num_output: 384\n    pad: 1\n    kernel_size: 3\n    group: 2\n    weight_filler {\n      type: \"gaussian\"\n      std: 0.01\n    }\n    bias_filler {\n      type: \"constant\"\n      value: 1\n    }\n  }\n}\nlayer {\n  name: \"relu4\"\n  type: \"ReLU\"\n  bottom: \"conv4\"\n  top: \"conv4\"\n}\nlayer {\n  name: \"conv5\"\n  type: \"Convolution\"\n  bottom: \"conv4\"\n  top: \"conv5\"\n  param {\n    lr_mult: 1\n    decay_mult: 1\n  }\n  param {\n    lr_mult: 2\n    decay_mult: 0\n  }\n  convolution_param {\n    num_output: 256\n    pad: 1\n    kernel_size: 3\n    group: 2\n    weight_filler {\n      type: \"gaussian\"\n      std: 0.01\n    }\n    bias_filler {\n      type: \"constant\"\n      value: 1\n    }\n  }\n}\nlayer {\n  name: \"relu5\"\n  type: \"ReLU\"\n  bottom: \"conv5\"\n  top: \"conv5\"\n}\nlayer {\n  name: \"pool5\"\n  type: \"Pooling\"\n  bottom: \"conv5\"\n  top: \"pool5\"\n  pooling_param {\n    pool: MAX\n    kernel_size: 3\n    stride: 2\n  }\n}\nlayer {\n  name: \"fc6\"\n  type: \"InnerProduct\"\n  bottom: \"pool5\"\n  top: \"fc6\"\n  param {\n    lr_mult: 1\n    
decay_mult: 1\n  }\n  param {\n    lr_mult: 2\n    decay_mult: 0\n  }\n  inner_product_param {\n    num_output: 4096\n    weight_filler {\n      type: \"gaussian\"\n      std: 0.005\n    }\n    bias_filler {\n      type: \"constant\"\n      value: 1\n    }\n  }\n}\nlayer {\n  name: \"relu6\"\n  type: \"ReLU\"\n  bottom: \"fc6\"\n  top: \"fc6\"\n}\nlayer {\n  name: \"drop6\"\n  type: \"Dropout\"\n  bottom: \"fc6\"\n  top: \"fc6\"\n  dropout_param {\n    dropout_ratio: 0.5\n  }\n}\nlayer {\n  name: \"fc7\"\n  type: \"InnerProduct\"\n  bottom: \"fc6\"\n  top: \"fc7\"\n  # Note that lr_mult can be set to 0 to disable any fine-tuning of this, and any other, layer\n  param {\n    lr_mult: 1\n    decay_mult: 1\n  }\n  param {\n    lr_mult: 2\n    decay_mult: 0\n  }\n  inner_product_param {\n    num_output: 4096\n    weight_filler {\n      type: \"gaussian\"\n      std: 0.005\n    }\n    bias_filler {\n      type: \"constant\"\n      value: 1\n    }\n  }\n}\nlayer {\n  name: \"relu7\"\n  type: \"ReLU\"\n  bottom: \"fc7\"\n  top: \"fc7\"\n}\nlayer {\n  name: \"drop7\"\n  type: \"Dropout\"\n  bottom: \"fc7\"\n  top: \"fc7\"\n  dropout_param {\n    dropout_ratio: 0.5\n  }\n}\n\nlayer {\n  name: \"lstm1\"\n  type: \"Lstm\"\n  bottom: \"fc7\"\n  top: \"lstm1\"\n\n  param {\n    lr_mult: 10\n    decay_mult: 1\n  }\n  param {\n    lr_mult: 20\n    decay_mult: 0\n  }\n\n  lstm_param {\n    num_output: 3000\n    clipping_threshold: 0.1\n    weight_filler {\n      type: \"gaussian\"\n      std: 0.1\n    }\n    bias_filler {\n      type: \"constant\"\n    }\n  }\n}\n\nlayer {\n  name: \"fc8_volleyball\"\n  type: \"InnerProduct\"\n  bottom: \"lstm1\"\n  top: \"fc8_volleyball\"\n  # lr_mult is set to higher than for other layers, because this layer is starting from random while the others are already trained\n  param {\n    lr_mult: 10\n    decay_mult: 1\n  }\n  param {\n    lr_mult: 20\n    decay_mult: 0\n  }\n  inner_product_param {\n    num_output: 9\n    weight_filler {\n      
type: \"gaussian\"\n      std: 0.01\n    }\n    bias_filler {\n      type: \"constant\"\n      value: 0\n    }\n  }\n}\nlayer {\n  name: \"loss\"\n  type: \"SoftmaxWithLoss\"\n  bottom: \"fc8_volleyball\"\n  bottom: \"label\"\n}\nlayer {\n  name: \"accuracy\"\n  type: \"Accuracy\"\n  bottom: \"fc8_volleyball\"\n  bottom: \"label\"\n  top: \"accuracy\"\n  include {\n    phase: TEST\n  }\n}\n\n\n\nlayer {\n  name: \"prop\"\n  type: \"Softmax\"\n  bottom: \"fc8_volleyball\"\n  top: \"prop\"\n}\n\nlayer {\n  name: \"Silence\"\n  type: \"Silence\"\n  bottom: \"prop\"\n}\n"
  },
  {
    "path": "ibrahim16-cvpr/p4-network2/clip_w10.txt",
    "content": "examples/deep-activity-rec/ibrahim16-cvpr/none.jpg 0\nexamples/deep-activity-rec/ibrahim16-cvpr/none.jpg 1\nexamples/deep-activity-rec/ibrahim16-cvpr/none.jpg 1\nexamples/deep-activity-rec/ibrahim16-cvpr/none.jpg 1\nexamples/deep-activity-rec/ibrahim16-cvpr/none.jpg 1\nexamples/deep-activity-rec/ibrahim16-cvpr/none.jpg 1\nexamples/deep-activity-rec/ibrahim16-cvpr/none.jpg 1\nexamples/deep-activity-rec/ibrahim16-cvpr/none.jpg 1\nexamples/deep-activity-rec/ibrahim16-cvpr/none.jpg 1\nexamples/deep-activity-rec/ibrahim16-cvpr/none.jpg 1\n"
  },
  {
    "path": "ibrahim16-cvpr/p4-network2/trainval-test-exe-script-resume.sh",
    "content": "#!/usr/bin/env sh\n\nOUTDIR=examples/deep-activity-rec/ibrahim16-cvpr/p4-network2\nGPU_ID=0\nITER=10000\n\necho \"Resuming Caffe using GPU\" $GPU \"In Directory \" $OUTDIR \"Starting from iteration \" $ITER\n\n./build/tools/caffe train 2> $OUTDIR/z_trainval-test-log-resume.txt \\\n  --solver $OUTDIR/trainval-test-solver.prototxt --snapshot=$OUTDIR/z_snapshot_iter_$ITER.solverstate  --gpu $GPU_ID\n"
  },
  {
    "path": "ibrahim16-cvpr/p4-network2/trainval-test-exe-script.sh",
    "content": "#!/usr/bin/env sh\n\nOUTDIR=examples/deep-activity-rec/ibrahim16-cvpr/p4-network2\nGPU_ID=0\n\necho \"Running Caffe using GPU\" $GPU \"In Directory \" $OUTDIR\n\n./build/tools/caffe train 2> $OUTDIR/z_trainval-test-log.txt \\\n  --solver $OUTDIR/trainval-test-solver.prototxt  --gpu $GPU_ID\n"
  },
  {
    "path": "ibrahim16-cvpr/p4-network2/trainval-test-network.prototxt",
    "content": "name: \"volleyball_level2\"\nlayer {\n  name: \"clip_data\"\n  type: \"ImageData\"\n  top: \"dummy\"\n  top: \"clip\"\n\n  image_data_param {\n    source: \"examples/deep-activity-rec/ibrahim16-cvpr/p4-network2/clip_w10.txt\"\n    batch_size: 250\n  }\n}\n\nlayer {\n  name: \"Silence\"\n  type: \"Silence\"\n  bottom: \"dummy\"\n}\n\nlayer {\n  name: \"volleyball_level2\"\n  type: \"Data\"\n  top: \"data\"\n  top: \"label\"\n  include {\n    phase: TRAIN\n  }\n  data_param {\n    source: \"examples/deep-activity-rec/ibrahim16-cvpr/p4-network2/trainval-leveldb\"\n    batch_size: 250\n    backend: LEVELDB\n  }\n}\nlayer {\n  name: \"volleyball_level2\"\n  type: \"Data\"\n  top: \"data\"\n  top: \"label\"\n  include {\n    phase: TEST\n  }\n  data_param {\n    source: \"examples/deep-activity-rec/ibrahim16-cvpr/p4-network2/test-leveldb\"\n    batch_size: 10\n    backend: LEVELDB\n  }\n}\n\nlayer {\n  name: \"fc1\"\n  type: \"InnerProduct\"\n  bottom: \"data\"\n  top: \"fc1\"\n  param {\n    lr_mult: 1\n    decay_mult: 1\n  }\n  param {\n    lr_mult: 2\n    decay_mult: 0\n  }\n  inner_product_param {\n    num_output: 3000\n    weight_filler {\n      type: \"gaussian\"\n      std: 0.01\n    }\n    bias_filler {\n      type: \"constant\"\n      value: 0\n    }\n  }\n}\n\nlayer {\n  name: \"relu7\"\n  type: \"ReLU\"\n  bottom: \"fc1\"\n  top: \"fc1\"\n}\nlayer {\n  name: \"drop7\"\n  type: \"Dropout\"\n  bottom: \"fc1\"\n  top: \"fc1\"\n  dropout_param {\n    dropout_ratio: 0.5\n  }\n}\n\nlayer {\n  name: \"lstm1\"\n  type: \"Lstm\"\n  bottom: \"fc1\"\n  bottom: \"clip\"\n  top: \"lstm1\"\n\n  param {\n    lr_mult: 1\n    decay_mult: 1\n  }\n  param {\n    lr_mult: 2\n    decay_mult: 0\n  }\n\n  lstm_param {\n    num_output: 1000\n    clipping_threshold: 0.1\n    weight_filler {\n      type: \"gaussian\"\n      std: 0.1\n    }\n    bias_filler {\n      type: \"constant\"\n    }\n  }\n}\n\nlayer {\n  name: \"fc_last\"\n  type: \"InnerProduct\"\n  bottom: 
\"lstm1\"\n  top: \"fc_last\"\n  param {\n    lr_mult: 1\n    decay_mult: 1\n  }\n  param {\n    lr_mult: 2\n    decay_mult: 0\n  }\n  inner_product_param {\n    num_output: 8\n    weight_filler {\n      type: \"gaussian\"\n      std: 0.01\n    }\n    bias_filler {\n      type: \"constant\"\n      value: 0\n    }\n  }\n}\n\nlayer {\n  name: \"loss\"\n  type: \"SoftmaxWithLoss\"\n  bottom: \"fc_last\"\n  bottom: \"label\"\n}\n\nlayer {\n  name: \"accuracy\"\n  type: \"Accuracy\"\n  bottom: \"fc_last\"\n  bottom: \"label\"\n  top: \"accuracy\"\n  include {\n    phase: TEST\n  }\n}\n\n\n\n\n\n\n"
  },
  {
    "path": "ibrahim16-cvpr/p4-network2/trainval-test-solver.prototxt",
    "content": "net: \"examples/deep-activity-rec/ibrahim16-cvpr/p4-network2/trainval-test-network.prototxt\"\n\n# testing examples are 13370 = 1337 * 10.\ntest_iter: 1337\ntest_interval: 2000\ndisplay: 2000\n\nbase_lr: 0.0001\nlr_policy: \"step\"\ngamma: 0.1\nstepsize: 10000\nmax_iter: 20000\nmomentum: 0.9\nweight_decay: 0.0005\n\nrandom_seed: 750301\nsolver_mode: GPU\n\nsnapshot: 5000\nsnapshot_prefix: \"examples/deep-activity-rec/ibrahim16-cvpr/p4-network2/z_snapshot\"\nsnapshot_after_train: true\n\n"
  },
  {
    "path": "ibrahim16-cvpr/p4-network2/trainval-test-window-evaluation-exe-script.sh",
    "content": "#!/usr/bin/env sh\n\nOUTDIR=examples/deep-activity-rec/ibrahim16-cvpr/p4-network2\nWINDOW=10\nGPU_ID=0\n\nTEST_EXAMPLES=1337\nITER=20000\nLAYER=prop\n\n\nexamples/deep-activity-rec/exePhase4  \\\n $WINDOW \\\n GPU $GPU_ID \\\n $OUTDIR/z_snapshot_iter_$ITER.caffemodel \\\n $OUTDIR/trainval-test-window-evaluation-network.prototxt \\\n $LAYER \\\n $TEST_EXAMPLES \\\n 2>&1 |  tee   $OUTDIR/z_trainval-test-window-evaluation-log-prop.txt \n"
  },
  {
    "path": "ibrahim16-cvpr/p4-network2/trainval-test-window-evaluation-network.prototxt",
    "content": "name: \"volleyball_level2\"\n\nlayer {\n  name: \"volleyball_data\"\n  type: \"Data\"\n  top: \"data\"\n  top: \"label\"\n  include {\n    phase: TEST\n  }\n  data_param {\n    source: \"examples/deep-activity-rec/ibrahim16-cvpr/p4-network2/test-leveldb\"\n    batch_size: 10\n    backend: LEVELDB\n  }\n}\n\nlayer {\n  name: \"fc1\"\n  type: \"InnerProduct\"\n  bottom: \"data\"\n  top: \"fc1\"\n  param {\n    lr_mult: 1\n    decay_mult: 1\n  }\n  param {\n    lr_mult: 2\n    decay_mult: 0\n  }\n  inner_product_param {\n    num_output: 3000\n    weight_filler {\n      type: \"gaussian\"\n      std: 0.01\n    }\n    bias_filler {\n      type: \"constant\"\n      value: 0\n    }\n  }\n}\n\nlayer {\n  name: \"relu7\"\n  type: \"ReLU\"\n  bottom: \"fc1\"\n  top: \"fc1\"\n}\nlayer {\n  name: \"drop7\"\n  type: \"Dropout\"\n  bottom: \"fc1\"\n  top: \"fc1\"\n  dropout_param {\n    dropout_ratio: 0.5\n  }\n}\n\nlayer {\n  name: \"lstm1\"\n  type: \"Lstm\"\n  bottom: \"fc1\"\n  top: \"lstm1\"\n\n  param {\n    lr_mult: 1\n    decay_mult: 1\n  }\n  param {\n    lr_mult: 2\n    decay_mult: 0\n  }\n\n  lstm_param {\n    num_output: 1000\n    clipping_threshold: 0.1\n    weight_filler {\n      type: \"gaussian\"\n      std: 0.1\n    }\n    bias_filler {\n      type: \"constant\"\n    }\n  }\n}\n\nlayer {\n  name: \"fc_last\"\n  type: \"InnerProduct\"\n  bottom: \"lstm1\"\n  top: \"fc_last\"\n  param {\n    lr_mult: 1\n    decay_mult: 1\n  }\n  param {\n    lr_mult: 2\n    decay_mult: 0\n  }\n  inner_product_param {\n    num_output: 8\n    weight_filler {\n      type: \"gaussian\"\n      std: 0.01\n    }\n    bias_filler {\n      type: \"constant\"\n      value: 0\n    }\n  }\n}\n\nlayer {\n  name: \"loss\"\n  type: \"SoftmaxWithLoss\"\n  bottom: \"fc_last\"\n  bottom: \"label\"\n}\n\nlayer {\n  name: \"accuracy\"\n  type: \"Accuracy\"\n  bottom: \"fc_last\"\n  bottom: \"label\"\n  top: \"accuracy\"\n  include {\n    phase: TEST\n  }\n}\n\n\n\n\n\n\n\nlayer {\n  
name: \"prop\"\n  type: \"Softmax\"\n  bottom: \"fc_last\"\n  top: \"prop\"\n}\n\nlayer {\n  name: \"argmax\"\n  type: \"ArgMax\"\n  argmax_param {\n    out_max_val: false\n    top_k: 1\n  }\n  bottom: \"prop\"\n  top: \"argmax\"\n}\n"
  },
  {
    "path": "ibrahim16-cvpr/script-clean.sh",
    "content": "#!/usr/bin/env bash\n\nROOT_DIR=examples/deep-activity-rec/ibrahim16-cvpr\n\n# Phase 1 artifacts\nrm -r $ROOT_DIR/p1-network1/test-leveldb\nrm -r $ROOT_DIR/p1-network1/trainval-leveldb\nrm $ROOT_DIR/p1-network1/mean.binaryproto\nrm $ROOT_DIR/p1-network1/z_log_dataset_net1.txt\nrm $ROOT_DIR/p1-network1/z_trainval-test-log.txt\nrm $ROOT_DIR/p1-network1/z_snapshot_iter_*.caffemodel\nrm $ROOT_DIR/p1-network1/z_snapshot_iter_*.solverstate\n\n# Phasse 2\nrm -r $ROOT_DIR/p2-ready-fuse\n\n# Phasse 3 & 4\nrm -r $ROOT_DIR/p4-network2/test-leveldb\nrm -r $ROOT_DIR/p4-network2/trainval-leveldb\nrm $ROOT_DIR/p4-network2/z_log_dataset_net2.txt\nrm $ROOT_DIR/p4-network2/z_trainval-test-log.txt\nrm $ROOT_DIR/p4-network2/z_trainval-test-window-evaluation-log-prop.txt\nrm $ROOT_DIR/p4-network2/z_snapshot_iter_*.caffemodel\nrm $ROOT_DIR/p4-network2/z_snapshot_iter_*.solverstate\n"
  },
  {
    "path": "ibrahim16-cvpr/script-p1-data.sh",
    "content": "#!/usr/bin/env sh\n\nCAFFE=/cs/vml2/msibrahi/workspaces/caffe-lstm\n\nGIT_PROJ_DIR=$CAFFE/examples/deep-activity-rec\nDATASET_VIDEOS=/cs/vml2/msibrahi/Datasets/Greg-Volleyball/volleyball\nDATASET_CONFIG=$GIT_PROJ_DIR/dataset-config\nOUTPUT_DIR=$GIT_PROJ_DIR/ibrahim16-cvpr\n\nTRAIN_SRC=trainval\nTEST_SRC=test\n\nWINDOW_NETWORK1=5\nWINDOW_NETWORK2=10\nSTEP=1\n\nGPU_ID=0\nNETWORK1_HIDDEN=3000\nNETWORK1_TRAIN_ITERS=15000\n\n# Fusion Styles: Choose 0-7\n# 0 => Conc / 1 group       1 => Max / 1 group        4 => Avg / 1 group        7 => sum / 1 group\n# 2 => Max / 2 groups\t    5 => Avg / 2 groups       3 => Max / 4 groups       6 => Avg / 4 groups    \nFUSION_STYLE=2\nFUSION_TRAIN_ITER=3493\nFUSION_TEST_ITER=1337\n\nVAR_FUSION_LAYERS_VAL=\"2 fc7 lstm1\"\nVAR_FUSION_LAYERS=\"FUSION_LAYERS\"\ndeclare \"$VAR_FUSION_LAYERS=$VAR_FUSION_LAYERS_VAL\"\n\nNETWORK1_DIR=$OUTPUT_DIR/p1-network1\nNETWORK1_MODEL_PATH=$NETWORK1_DIR/z_snapshot_iter_$NETWORK1_TRAIN_ITERS.caffemodel\nNETWORK2_LEVELDB_FUSION_DIR=$OUTPUT_DIR/p2-ready-fuse\nNETWORK2_EXTRACTION_NETOWRK_DIR=$OUTPUT_DIR/p3-extract-features-networks\nNETWORK2_DIR=$OUTPUT_DIR/p4-network2\n\n# Programs\nEXE_P1_NETWORK1=exePhase1_2\nEXE_P2_FUSE=exePhase1_2\nEXE_P4_NETWORK2=exePhase3\n\n###########################################################################\necho ------------------------------------------------------\necho \necho \"START processing script\" \"$0\"\necho \"OUTPUT Directory is \" $OUTPUT_DIR\necho \necho Doing path VALIDATIONS\n\n## Some directories / files validation\n\n[ -d $CAFFE ]             || echo Directory $CAFFE NOOOT exist\n[ -d $OUTPUT_DIR ]        || echo Directory $OUTPUT_DIR NOOOT exist\n[ -d $DATASET_VIDEOS ]    || echo Directory $DATASET_VIDEOS NOOOT exist\n[ -d $DATASET_CONFIG ]    || echo Directory $DATASET_CONFIG NOOOT exist\n[ -d $NETWORK1_DIR ]      || echo Directory $NETWORK1_DIR NOOOT exist\n[ -d $NETWORK2_EXTRACTION_NETOWRK_DIR ] || echo Directory 
$NETWORK2_EXTRACTION_NETOWRK_DIR NOOOT exist\n[ -d $NETWORK2_DIR ]      || echo Directory $NETWORK2_DIR NOOOT exist\n\necho READY...STEADY...Gooo ?\nread -t 60\n\ncd $CAFFE\n\n\n\n\n\n\n\n\n\n\n\n###########################################################################\necho ------------------------------------------------------\necho Phase 1 - Generating Network 1 Data - $NETWORK1_DIR\n\n# Clean if some previous wrong data\n[ -d $NETWORK1_DIR/$TRAIN_SRC-leveldb ]  && [ ! -d $NETWORK1_DIR/$TEST_SRC-leveldb ] && rm -r $NETWORK1_DIR/$TRAIN_SRC-leveldb\n[ ! -d $NETWORK1_DIR/$TRAIN_SRC-leveldb ]  && [ -d $NETWORK1_DIR/$TEST_SRC-leveldb ] && rm -r $NETWORK1_DIR/$TEST_SRC-leveldb\n\n$GIT_PROJ_DIR/$EXE_P1_NETWORK1  \\\n  $DATASET_VIDEOS   \\\n  $DATASET_CONFIG   \\\n  $NETWORK1_DIR       $WINDOW_NETWORK1    $STEP  1 \\\n  2>&1 |  tee    \\\n  $NETWORK1_DIR/z_log_dataset_net1.txt   \n\n\necho ========================\n\necho Phase 1 - B - Computing Mean of Network 1 Training Data - $NETWORK1_DIR\n$NETWORK1_DIR/$TRAIN_SRC-$TEST_SRC-create-mean-script.sh\n\nread -t 10\n\n\n\n\n\n"
  },
  {
    "path": "ibrahim16-cvpr/script-p1-train-p3-p4.sh",
    "content": "#!/usr/bin/env sh\n\nCAFFE=/cs/vml2/msibrahi/workspaces/caffe-lstm\n\nGIT_PROJ_DIR=$CAFFE/examples/deep-activity-rec\nDATASET_VIDEOS=/cs/vml2/msibrahi/Datasets/Greg-Volleyball/volleyball\nDATASET_CONFIG=$GIT_PROJ_DIR/dataset-config\nOUTPUT_DIR=$GIT_PROJ_DIR/ibrahim16-cvpr\n\nTRAIN_SRC=trainval\nTEST_SRC=test\n\nWINDOW_NETWORK1=5\nWINDOW_NETWORK2=10\nSTEP=1\n\nGPU_ID=0\nNETWORK1_HIDDEN=3000\nNETWORK1_TRAIN_ITERS=15000\n\n# Fusion Styles: Choose 0-7\n# 0 => Conc / 1 group       1 => Max / 1 group        4 => Avg / 1 group        7 => sum / 1 group\n# 2 => Max / 2 groups\t    5 => Avg / 2 groups       3 => Max / 4 groups       6 => Avg / 4 groups    \nFUSION_STYLE=2\nFUSION_TRAIN_ITER=3493\nFUSION_TEST_ITER=1337\n\nVAR_FUSION_LAYERS_VAL=\"2 fc7 lstm1\"\nVAR_FUSION_LAYERS=\"FUSION_LAYERS\"\ndeclare \"$VAR_FUSION_LAYERS=$VAR_FUSION_LAYERS_VAL\"\n\nNETWORK1_DIR=$OUTPUT_DIR/p1-network1\nNETWORK1_MODEL_PATH=$NETWORK1_DIR/z_snapshot_iter_$NETWORK1_TRAIN_ITERS.caffemodel\nNETWORK2_LEVELDB_FUSION_DIR=$OUTPUT_DIR/p2-ready-fuse\nNETWORK2_EXTRACTION_NETOWRK_DIR=$OUTPUT_DIR/p3-extract-features-networks\nNETWORK2_DIR=$OUTPUT_DIR/p4-network2\n\n# Programs\nEXE_P1_NETWORK1=exePhase1_2\nEXE_P2_FUSE=exePhase1_2\nEXE_P4_NETWORK2=exePhase3\n\n###########################################################################\necho ------------------------------------------------------\necho \necho \"START processing script\" \"$0\"\necho \"OUTPUT Directory is \" $OUTPUT_DIR\necho \necho Doing path VALIDATIONS\n\n## Some directories / files validation\n\n[ -d $CAFFE ]             || echo Directory $CAFFE NOOOT exist\n[ -d $OUTPUT_DIR ]        || echo Directory $OUTPUT_DIR NOOOT exist\n[ -d $DATASET_VIDEOS ]    || echo Directory $DATASET_VIDEOS NOOOT exist\n[ -d $DATASET_CONFIG ]    || echo Directory $DATASET_CONFIG NOOOT exist\n[ -d $NETWORK1_DIR ]      || echo Directory $NETWORK1_DIR NOOOT exist\n[ -d $NETWORK2_EXTRACTION_NETOWRK_DIR ] || echo Directory 
$NETWORK2_EXTRACTION_NETOWRK_DIR NOOOT exist\n[ -d $NETWORK2_DIR ]      || echo Directory $NETWORK2_DIR NOOOT exist\n\necho READY...STEADY...Gooo ?\nread -t 60\n\ncd $CAFFE\n\n\n\n\n\n\n\n\n\n\n\n###########################################################################\necho ========================\n\n\necho Phase 1 - C - Network 1 Training\n$NETWORK1_DIR/$TRAIN_SRC-$TEST_SRC-exe-script.sh\n\nread -t 10\n\n\n\n\n\n\n\n\n\n\n###############\necho ------------------------------------------------------\n# Info: # iterations (e.g. 10863 = 17 * 639; 639 is the # of test cases).\n# Info: Inside the prototxt, a batch # equal to # of persons (e.g. 5). 17 = 2 * 8 +1. 8 is the right temporal width\n\necho Phase 3 - Generating LSTM 2 Data - $NETWORK2_DIR\n\n# Clean if some previous wrong data\n[ -d $NETWORK2_DIR/$TRAIN_SRC-leveldb ]  && [ ! -d $NETWORK2_DIR/$TEST_SRC-leveldb ] && rm -r $NETWORK2_DIR/$TRAIN_SRC-leveldb\n[ ! -d $NETWORK2_DIR/$TRAIN_SRC-leveldb ]  && [ -d $NETWORK2_DIR/$TEST_SRC-leveldb ] && rm -r $NETWORK2_DIR/$TEST_SRC-leveldb\n\n$GIT_PROJ_DIR/$EXE_P4_NETWORK2     \\\n   $FUSION_STYLE    $WINDOW_NETWORK2      GPU $GPU_ID     \\\n   $NETWORK1_MODEL_PATH   \\\n   $NETWORK2_EXTRACTION_NETOWRK_DIR/$TRAIN_SRC.prototxt   \\\n   $FUSION_LAYERS     \\\n   $NETWORK2_DIR/$TRAIN_SRC-leveldb   \\\n   $FUSION_TRAIN_ITER   \\\n   $FUSION_STYLE    $WINDOW_NETWORK2      GPU $GPU_ID    \\\n   $NETWORK1_MODEL_PATH    \\\n   $NETWORK2_EXTRACTION_NETOWRK_DIR/$TEST_SRC.prototxt    \\\n   $FUSION_LAYERS         \\\n   $NETWORK2_DIR/$TEST_SRC-leveldb    \\\n   $FUSION_TEST_ITER   \\\n   2>&1 |  tee    \\\n   $NETWORK2_DIR/z_log_dataset_net2.txt\n\nread -t 10\n\n\n\n\n\n\n\n\n\n\n\n###############\necho ------------------------------------------------------\n\necho Phase 4 - A - LSTM 2 Training - $NETWORK2_DIR\n\n$NETWORK2_DIR/$TRAIN_SRC-$TEST_SRC-exe-script.sh\n\n\n\n\n\necho ========================\n\n\necho Phase 4 - B - Temporal 
Evaluation\n$NETWORK2_DIR/$TRAIN_SRC-$TEST_SRC-window-evaluation-exe-script.sh\n\n\n\n\n\n\n\n\n\n###############\necho ------------------------------------------------------\necho \necho DONE processing script \"$0\"\necho \necho ------------------------------------------------------\n\n"
  },
  {
    "path": "ibrahim16-cvpr/script-p2-data-fuse.sh",
    "content": "#!/usr/bin/env sh\n\nCAFFE=/cs/vml2/msibrahi/workspaces/caffe-lstm\n\nGIT_PROJ_DIR=$CAFFE/examples/deep-activity-rec\nDATASET_VIDEOS=/cs/vml2/msibrahi/Datasets/Greg-Volleyball/volleyball\nDATASET_CONFIG=$GIT_PROJ_DIR/dataset-config\nOUTPUT_DIR=$GIT_PROJ_DIR/ibrahim16-cvpr\n\nTRAIN_SRC=trainval\nTEST_SRC=test\n\nWINDOW_NETWORK1=5\nWINDOW_NETWORK2=10\nSTEP=1\n\nGPU_ID=0\nNETWORK1_HIDDEN=3000\nNETWORK1_TRAIN_ITERS=15000\n\n# Fusion Styles: Choose 0-7\n# 0 => Conc / 1 group       1 => Max / 1 group        4 => Avg / 1 group        7 => sum / 1 group\n# 2 => Max / 2 groups\t    5 => Avg / 2 groups       3 => Max / 4 groups       6 => Avg / 4 groups    \nFUSION_STYLE=2\nFUSION_TRAIN_ITER=3493\nFUSION_TEST_ITER=1337\n\nVAR_FUSION_LAYERS_VAL=\"2 fc7 lstm1\"\nVAR_FUSION_LAYERS=\"FUSION_LAYERS\"\ndeclare \"$VAR_FUSION_LAYERS=$VAR_FUSION_LAYERS_VAL\"\n\nNETWORK1_DIR=$OUTPUT_DIR/p1-network1\nNETWORK1_MODEL_PATH=$NETWORK1_DIR/z_snapshot_iter_$NETWORK1_TRAIN_ITERS.caffemodel\nNETWORK2_LEVELDB_FUSION_DIR=$OUTPUT_DIR/p2-ready-fuse\nNETWORK2_EXTRACTION_NETOWRK_DIR=$OUTPUT_DIR/p3-extract-features-networks\nNETWORK2_DIR=$OUTPUT_DIR/p4-network2\n\n# Programs\nEXE_P1_NETWORK1=exePhase1_2\nEXE_P2_FUSE=exePhase1_2\nEXE_P4_NETWORK2=exePhase3\n\n###########################################################################\necho ------------------------------------------------------\necho \necho \"START processing script\" \"$0\"\necho \"OUTPUT Directory is \" $OUTPUT_DIR\necho \necho Doing path VALIDATIONS\n\n## Some directories / files validation\n\n[ -d $CAFFE ]             || echo Directory $CAFFE NOOOT exist\n[ -d $OUTPUT_DIR ]        || echo Directory $OUTPUT_DIR NOOOT exist\n[ -d $DATASET_VIDEOS ]    || echo Directory $DATASET_VIDEOS NOOOT exist\n[ -d $DATASET_CONFIG ]    || echo Directory $DATASET_CONFIG NOOOT exist\n[ -d $NETWORK1_DIR ]      || echo Directory $NETWORK1_DIR NOOOT exist\n[ -d $NETWORK2_EXTRACTION_NETOWRK_DIR ] || echo Directory 
$NETWORK2_EXTRACTION_NETOWRK_DIR NOOOT exist\n[ -d $NETWORK2_DIR ]      || echo Directory $NETWORK2_DIR NOOOT exist\n\necho READY...STEADY...Gooo ?\nread -t 60\n\ncd $CAFFE\n\n\n\n\n\n\n\n\n\n\n###############\necho ------------------------------------------------------\necho Phase 2 - A - Generating Data to be Fused - $NETWORK2_LEVELDB_FUSION_DIR\n\nmkdir -p $NETWORK2_LEVELDB_FUSION_DIR\n\n# Clean if some previous wrong data\n[ -d $NETWORK2_LEVELDB_FUSION_DIR/$TRAIN_SRC-leveldb ]  && [ ! -d $NETWORK2_LEVELDB_FUSION_DIR/$TEST_SRC-leveldb ] && rm -r $NETWORK2_LEVELDB_FUSION_DIR/$TRAIN_SRC-leveldb\n[ ! -d $NETWORK2_LEVELDB_FUSION_DIR/$TRAIN_SRC-leveldb ]  && [ -d $NETWORK2_LEVELDB_FUSION_DIR/$TEST_SRC-leveldb ] && rm -r $NETWORK2_LEVELDB_FUSION_DIR/$TEST_SRC-leveldb\n\n$GIT_PROJ_DIR/$EXE_P2_FUSE   \\\n  $DATASET_VIDEOS   \\\n  $DATASET_CONFIG   \\\n  $NETWORK2_LEVELDB_FUSION_DIR       $WINDOW_NETWORK2    $STEP  0  \\\n  2>&1 |  tee    \\\n  $NETWORK2_LEVELDB_FUSION_DIR/z_log_dataset_fuse.txt   \n\necho ========================\n\n\n\necho Phase 2 - B - Creating Mean File of Fused Data\necho \"Computing image mean for dataset: \" $TRAIN_SRC\n\n$CAFFE/build/tools/compute_image_mean -backend=leveldb  $NETWORK2_LEVELDB_FUSION_DIR/$TRAIN_SRC-leveldb $NETWORK2_LEVELDB_FUSION_DIR/mean.binaryproto\n\n\nread -t 10\n\n\n\n\n\n\n\n\n\n###############\necho ------------------------------------------------------\necho \necho DONE processing script \"$0\"\necho \necho ------------------------------------------------------\n\n"
  },
  {
    "path": "ibrahim16-cvpr/script.sh",
    "content": "#!/usr/bin/env sh\n\nCAFFE=/cs/vml2/msibrahi/workspaces/caffe-lstm\n\nGIT_PROJ_DIR=$CAFFE/examples/deep-activity-rec\nDATASET_VIDEOS=/cs/vml2/msibrahi/Datasets/Greg-Volleyball/volleyballVolleyball\nDATASET_CONFIG=$GIT_PROJ_DIR/dataset-config\nOUTPUT_DIR=$GIT_PROJ_DIR/ibrahim16-cvpr\n\nTRAIN_SRC=trainval\nTEST_SRC=test\n\nWINDOW_NETWORK1=5\nWINDOW_NETWORK2=10\nSTEP=1\n\nGPU_ID=0\nNETWORK1_HIDDEN=3000\nNETWORK1_TRAIN_ITERS=15000\n\n# Fusion Styles: Choose 0-7\n# 0 => Conc / 1 group       1 => Max / 1 group        4 => Avg / 1 group        7 => sum / 1 group\n# 2 => Max / 2 groups\t    5 => Avg / 2 groups       3 => Max / 4 groups       6 => Avg / 4 groups    \nFUSION_STYLE=2\nFUSION_TRAIN_ITER=3493\nFUSION_TEST_ITER=1337\n\nVAR_FUSION_LAYERS_VAL=\"2 fc7 lstm1\"\nVAR_FUSION_LAYERS=\"FUSION_LAYERS\"\ndeclare \"$VAR_FUSION_LAYERS=$VAR_FUSION_LAYERS_VAL\"\n\nNETWORK1_DIR=$OUTPUT_DIR/p1-network1\nNETWORK1_MODEL_PATH=$NETWORK1_DIR/z_snapshot_iter_$NETWORK1_TRAIN_ITERS.caffemodel\nNETWORK2_LEVELDB_FUSION_DIR=$OUTPUT_DIR/p2-ready-fuse\nNETWORK2_EXTRACTION_NETOWRK_DIR=$OUTPUT_DIR/p3-extract-features-networks\nNETWORK2_DIR=$OUTPUT_DIR/p4-network2\n\n# Programs\nEXE_P1_NETWORK1=exePhase1_2\nEXE_P2_FUSE=exePhase1_2\nEXE_P4_NETWORK2=exePhase3\n\n###########################################################################\necho ------------------------------------------------------\necho \necho \"START processing script\" \"$0\"\necho \"OUTPUT Directory is \" $OUTPUT_DIR\necho \necho Doing path VALIDATIONS\n\n## Some directories / files validation\n\n[ -d $CAFFE ]             || echo Directory $CAFFE NOOOT exist\n[ -d $OUTPUT_DIR ]        || echo Directory $OUTPUT_DIR NOOOT exist\n[ -d $DATASET_VIDEOS ]    || echo Directory $DATASET_VIDEOS NOOOT exist\n[ -d $DATASET_CONFIG ]    || echo Directory $DATASET_CONFIG NOOOT exist\n[ -d $NETWORK1_DIR ]      || echo Directory $NETWORK1_DIR NOOOT exist\n[ -d $NETWORK2_EXTRACTION_NETOWRK_DIR ] || echo Directory 
$NETWORK2_EXTRACTION_NETOWRK_DIR NOOOT exist\n[ -d $NETWORK2_DIR ]      || echo Directory $NETWORK2_DIR NOOOT exist\n\necho READY...STEADY...Gooo ?\nread -t 60\n\ncd $CAFFE\n\n\n\n\n\n\n\n\n\n\n\n###########################################################################\necho ------------------------------------------------------\necho Phase 1 - Generating Network 1 Data - $NETWORK1_DIR\n\n# Clean if some previous wrong data\n[ -d $NETWORK1_DIR/$TRAIN_SRC-leveldb ]  && [ ! -d $NETWORK1_DIR/$TEST_SRC-leveldb ] && rm -r $NETWORK1_DIR/$TRAIN_SRC-leveldb\n[ ! -d $NETWORK1_DIR/$TRAIN_SRC-leveldb ]  && [ -d $NETWORK1_DIR/$TEST_SRC-leveldb ] && rm -r $NETWORK1_DIR/$TEST_SRC-leveldb\n\n$GIT_PROJ_DIR/$EXE_P1_NETWORK1  \\\n  $DATASET_VIDEOS   \\\n  $DATASET_CONFIG   \\\n  $NETWORK1_DIR       $WINDOW_NETWORK1    $STEP  1 \\\n  2>&1 |  tee    \\\n  $NETWORK1_DIR/z_log_dataset_net1.txt   \n\n\necho ========================\n\necho Phase 1 - B - Computing Mean of Network 1 Training Data - $NETWORK1_DIR\n$NETWORK1_DIR/$TRAIN_SRC-$TEST_SRC-create-mean-script.sh\n\nread -t 10\n\n\n\n\n\n\n\necho ========================\n\n\necho Phase 1 - C - Network 1 Training\n$NETWORK1_DIR/$TRAIN_SRC-$TEST_SRC-exe-script.sh\n\nread -t 10\n\n\n\n\n\n\n###############\necho ------------------------------------------------------\necho Phase 2 - A - Generating Data to be Fused - $NETWORK2_LEVELDB_FUSION_DIR\n\nmkdir -p $NETWORK2_LEVELDB_FUSION_DIR\n\n# Clean if some previous wrong data\n[ -d $NETWORK2_LEVELDB_FUSION_DIR/$TRAIN_SRC-leveldb ]  && [ ! -d $NETWORK2_LEVELDB_FUSION_DIR/$TEST_SRC-leveldb ] && rm -r $NETWORK2_LEVELDB_FUSION_DIR/$TRAIN_SRC-leveldb\n[ ! 
-d $NETWORK2_LEVELDB_FUSION_DIR/$TRAIN_SRC-leveldb ]  && [ -d $NETWORK2_LEVELDB_FUSION_DIR/$TEST_SRC-leveldb ] && rm -r $NETWORK2_LEVELDB_FUSION_DIR/$TEST_SRC-leveldb\n\n$GIT_PROJ_DIR/$EXE_P2_FUSE   \\\n  $DATASET_VIDEOS   \\\n  $DATASET_CONFIG   \\\n  $NETWORK2_LEVELDB_FUSION_DIR       $WINDOW_NETWORK2    $STEP  0  \\\n  2>&1 |  tee    \\\n  $NETWORK2_LEVELDB_FUSION_DIR/z_log_dataset_fuse.txt   \n\necho ========================\n\n\n\necho Phase 2 - B - Creating Mean File of Fused Data\necho \"Computing image mean for dataset: \" $TRAIN_SRC\n\n$CAFFE/build/tools/compute_image_mean -backend=leveldb  $NETWORK2_LEVELDB_FUSION_DIR/$TRAIN_SRC-leveldb $NETWORK2_LEVELDB_FUSION_DIR/mean.binaryproto\n\n\nread -t 10\n\n\n\n\n\n\n\n\n\n###############\necho ------------------------------------------------------\n# Info: # iterations (e.g. 10863 = 17 * 639; 639 is the # of test cases).\n# Info: Inside the prototxt, a batch # equal to # of persons (e.g. 5). 17 = 2 * 8 +1. 8 is the right temporal width\n\necho Phase 3 - Generating LSTM 2 Data - $NETWORK2_DIR\n\n# Clean if some previous wrong data\n[ -d $NETWORK2_DIR/$TRAIN_SRC-leveldb ]  && [ ! -d $NETWORK2_DIR/$TEST_SRC-leveldb ] && rm -r $NETWORK2_DIR/$TRAIN_SRC-leveldb\n[ ! 
-d $NETWORK2_DIR/$TRAIN_SRC-leveldb ]  && [ -d $NETWORK2_DIR/$TEST_SRC-leveldb ] && rm -r $NETWORK2_DIR/$TEST_SRC-leveldb\n\n$GIT_PROJ_DIR/$EXE_P4_NETWORK2     \\\n   $FUSION_STYLE    $WINDOW_NETWORK2      GPU $GPU_ID     \\\n   $NETWORK1_MODEL_PATH   \\\n   $NETWORK2_EXTRACTION_NETOWRK_DIR/$TRAIN_SRC.prototxt   \\\n   $FUSION_LAYERS     \\\n   $NETWORK2_DIR/$TRAIN_SRC-leveldb   \\\n   $FUSION_TRAIN_ITER   \\\n   $FUSION_STYLE    $WINDOW_NETWORK2      GPU $GPU_ID    \\\n   $NETWORK1_MODEL_PATH    \\\n   $NETWORK2_EXTRACTION_NETOWRK_DIR/$TEST_SRC.prototxt    \\\n   $FUSION_LAYERS         \\\n   $NETWORK2_DIR/$TEST_SRC-leveldb    \\\n   $FUSION_TEST_ITER   \\\n   2>&1 |  tee    \\\n   $NETWORK2_DIR/z_log_dataset_net2.txt\n\nread -t 10\n\n\n\n\n\n\n\n\n\n\n\n###############\necho ------------------------------------------------------\n\necho Phase 4 - A - LSTM 2 Training - $NETWORK2_DIR\n\n$NETWORK2_DIR/$TRAIN_SRC-$TEST_SRC-exe-script.sh\n\n\n\n\n\necho ========================\n\n\necho Phase 4 - B - Temporal Evaluation\n$NETWORK2_DIR/$TRAIN_SRC-$TEST_SRC-window-evaluation-exe-script.sh\n\n\n\n\n\n\n\n\n\n###############\necho ------------------------------------------------------\necho \necho DONE processing script \"$0\"\necho \necho ------------------------------------------------------\n\n"
  },
  {
    "path": "ibrahim16-cvpr-simple/p1-network1/clip_w5.txt",
    "content": "examples/deep-activity-rec/ibrahim16-cvpr-simple/none.jpg 0\nexamples/deep-activity-rec/ibrahim16-cvpr-simple/none.jpg 1\nexamples/deep-activity-rec/ibrahim16-cvpr-simple/none.jpg 1\nexamples/deep-activity-rec/ibrahim16-cvpr-simple/none.jpg 1\nexamples/deep-activity-rec/ibrahim16-cvpr-simple/none.jpg 1\n"
  },
  {
    "path": "ibrahim16-cvpr-simple/p1-network1/trainval-test-create-mean-script.sh",
    "content": "#!/usr/bin/env sh\n# This script converts the vollyball data into leveldb format.\n\nOUTDIR=examples/deep-activity-rec/ibrahim16-cvpr-simple/p1-network1\n\necho \"Computing image mean for trainval dataset: \" $OUTDIR\n\n./build/tools/compute_image_mean -backend=leveldb $OUTDIR/trainval-leveldb $OUTDIR/mean.binaryproto\n\necho \"Done.\"\n"
  },
  {
    "path": "ibrahim16-cvpr-simple/p1-network1/trainval-test-exe-script-resume.sh",
    "content": "#!/usr/bin/env sh\n\n\nOUTDIR=examples/deep-activity-rec/ibrahim16-cvpr-simple/p1-network1\nGPU_ID=0\nITER=15000\n\necho \"Resuming Caffe using GPU\" $GPU \"In Directory \" $OUTDIR \"Starting from iteration \" $ITER\n\n./build/tools/caffe train 2> $OUTDIR/z_trainval-test-log-resume.txt \\\n  --solver $OUTDIR/trainval-test-solver.prototxt    --snapshot=$OUTDIR/z_snapshot_iter_$ITER.solverstate  --gpu $GPU_ID\n"
  },
  {
    "path": "ibrahim16-cvpr-simple/p1-network1/trainval-test-exe-script.sh",
    "content": "#!/usr/bin/env sh\n\nOUTDIR=examples/deep-activity-rec/ibrahim16-cvpr-simple/p1-network1\nGPU_ID=0\n\necho \"Running Caffe using GPU\" $GPU \"In Directory \" $OUTDIR\n\n./build/tools/caffe train 2> $OUTDIR/z_trainval-test-log.txt \\\n  --solver $OUTDIR/trainval-test-solver.prototxt      --weights    models/bvlc_reference_caffenet/bvlc_reference_caffenet.caffemodel    --gpu $GPU_ID\n"
  },
  {
    "path": "ibrahim16-cvpr-simple/p1-network1/trainval-test-network.prototxt",
    "content": "name: \"volleyball_game_proto\"\nlayer {\n  name: \"clip_data\"\n  type: \"ImageData\"\n  top: \"dummy\"\n  top: \"clip\"\n\n  image_data_param {\n    source: \"examples/deep-activity-rec/ibrahim16-cvpr-simple/p1-network1/clip_w5.txt\"\n    batch_size: 250\n  }\n}\n\nlayer {\n  name: \"Silence\"\n  type: \"Silence\"\n  bottom: \"dummy\"\n}\n\nlayer {\n  name: \"volleyball_game\"\n  type: \"Data\"\n  top: \"data\"\n  top: \"label\"\n  include {\n    phase: TRAIN\n  }\n  transform_param {\n    mirror: true\n    crop_size: 227\n    mean_file: \"examples/deep-activity-rec/ibrahim16-cvpr-simple/p1-network1/mean.binaryproto\"\n  }\n  data_param {\n    source: \"examples/deep-activity-rec/ibrahim16-cvpr-simple/p1-network1/trainval-leveldb\"\n    batch_size: 10\n    backend: LEVELDB\n  }\n}\nlayer {\n  name: \"volleyball_game\"\n  type: \"Data\"\n  top: \"data\"\n  top: \"label\"\n  include {\n    phase: TEST\n  }\n  transform_param {\n    mirror: false\n    crop_size: 227\n     mean_file: \"examples/deep-activity-rec/ibrahim16-cvpr-simple/p1-network1/mean.binaryproto\"\n  }\n  data_param {\n source: \"examples/deep-activity-rec/ibrahim16-cvpr-simple/p1-network1/test-leveldb\"\n    batch_size: 10\n    backend: LEVELDB\n  }\n}\nlayer {\n  name: \"conv1\"\n  type: \"Convolution\"\n  bottom: \"data\"\n  top: \"conv1\"\n  param {\n    lr_mult: 1\n    decay_mult: 1\n  }\n  param {\n    lr_mult: 2\n    decay_mult: 0\n  }\n  convolution_param {\n    num_output: 96\n    kernel_size: 11\n    stride: 4\n    weight_filler {\n      type: \"gaussian\"\n      std: 0.01\n    }\n    bias_filler {\n      type: \"constant\"\n      value: 0\n    }\n  }\n}\nlayer {\n  name: \"relu1\"\n  type: \"ReLU\"\n  bottom: \"conv1\"\n  top: \"conv1\"\n}\nlayer {\n  name: \"pool1\"\n  type: \"Pooling\"\n  bottom: \"conv1\"\n  top: \"pool1\"\n  pooling_param {\n    pool: MAX\n    kernel_size: 3\n    stride: 2\n  }\n}\nlayer {\n  name: \"norm1\"\n  type: \"LRN\"\n  bottom: \"pool1\"\n  top: 
\"norm1\"\n  lrn_param {\n    local_size: 5\n    alpha: 0.0001\n    beta: 0.75\n  }\n}\nlayer {\n  name: \"conv2\"\n  type: \"Convolution\"\n  bottom: \"norm1\"\n  top: \"conv2\"\n  param {\n    lr_mult: 1\n    decay_mult: 1\n  }\n  param {\n    lr_mult: 2\n    decay_mult: 0\n  }\n  convolution_param {\n    num_output: 256\n    pad: 2\n    kernel_size: 5\n    group: 2\n    weight_filler {\n      type: \"gaussian\"\n      std: 0.01\n    }\n    bias_filler {\n      type: \"constant\"\n      value: 1\n    }\n  }\n}\nlayer {\n  name: \"relu2\"\n  type: \"ReLU\"\n  bottom: \"conv2\"\n  top: \"conv2\"\n}\nlayer {\n  name: \"pool2\"\n  type: \"Pooling\"\n  bottom: \"conv2\"\n  top: \"pool2\"\n  pooling_param {\n    pool: MAX\n    kernel_size: 3\n    stride: 2\n  }\n}\nlayer {\n  name: \"norm2\"\n  type: \"LRN\"\n  bottom: \"pool2\"\n  top: \"norm2\"\n  lrn_param {\n    local_size: 5\n    alpha: 0.0001\n    beta: 0.75\n  }\n}\nlayer {\n  name: \"conv3\"\n  type: \"Convolution\"\n  bottom: \"norm2\"\n  top: \"conv3\"\n  param {\n    lr_mult: 1\n    decay_mult: 1\n  }\n  param {\n    lr_mult: 2\n    decay_mult: 0\n  }\n  convolution_param {\n    num_output: 384\n    pad: 1\n    kernel_size: 3\n    weight_filler {\n      type: \"gaussian\"\n      std: 0.01\n    }\n    bias_filler {\n      type: \"constant\"\n      value: 0\n    }\n  }\n}\nlayer {\n  name: \"relu3\"\n  type: \"ReLU\"\n  bottom: \"conv3\"\n  top: \"conv3\"\n}\nlayer {\n  name: \"conv4\"\n  type: \"Convolution\"\n  bottom: \"conv3\"\n  top: \"conv4\"\n  param {\n    lr_mult: 1\n    decay_mult: 1\n  }\n  param {\n    lr_mult: 2\n    decay_mult: 0\n  }\n  convolution_param {\n    num_output: 384\n    pad: 1\n    kernel_size: 3\n    group: 2\n    weight_filler {\n      type: \"gaussian\"\n      std: 0.01\n    }\n    bias_filler {\n      type: \"constant\"\n      value: 1\n    }\n  }\n}\nlayer {\n  name: \"relu4\"\n  type: \"ReLU\"\n  bottom: \"conv4\"\n  top: \"conv4\"\n}\nlayer {\n  name: \"conv5\"\n  type: 
\"Convolution\"\n  bottom: \"conv4\"\n  top: \"conv5\"\n  param {\n    lr_mult: 1\n    decay_mult: 1\n  }\n  param {\n    lr_mult: 2\n    decay_mult: 0\n  }\n  convolution_param {\n    num_output: 256\n    pad: 1\n    kernel_size: 3\n    group: 2\n    weight_filler {\n      type: \"gaussian\"\n      std: 0.01\n    }\n    bias_filler {\n      type: \"constant\"\n      value: 1\n    }\n  }\n}\nlayer {\n  name: \"relu5\"\n  type: \"ReLU\"\n  bottom: \"conv5\"\n  top: \"conv5\"\n}\nlayer {\n  name: \"pool5\"\n  type: \"Pooling\"\n  bottom: \"conv5\"\n  top: \"pool5\"\n  pooling_param {\n    pool: MAX\n    kernel_size: 3\n    stride: 2\n  }\n}\nlayer {\n  name: \"fc6\"\n  type: \"InnerProduct\"\n  bottom: \"pool5\"\n  top: \"fc6\"\n  param {\n    lr_mult: 1\n    decay_mult: 1\n  }\n  param {\n    lr_mult: 2\n    decay_mult: 0\n  }\n  inner_product_param {\n    num_output: 4096\n    weight_filler {\n      type: \"gaussian\"\n      std: 0.005\n    }\n    bias_filler {\n      type: \"constant\"\n      value: 1\n    }\n  }\n}\nlayer {\n  name: \"relu6\"\n  type: \"ReLU\"\n  bottom: \"fc6\"\n  top: \"fc6\"\n}\nlayer {\n  name: \"drop6\"\n  type: \"Dropout\"\n  bottom: \"fc6\"\n  top: \"fc6\"\n  dropout_param {\n    dropout_ratio: 0.5\n  }\n}\nlayer {\n  name: \"fc7\"\n  type: \"InnerProduct\"\n  bottom: \"fc6\"\n  top: \"fc7\"\n  # Note that lr_mult can be set to 0 to disable any fine-tuning of this, and any other, layer\n  param {\n    lr_mult: 1\n    decay_mult: 1\n  }\n  param {\n    lr_mult: 2\n    decay_mult: 0\n  }\n  inner_product_param {\n    num_output: 4096\n    weight_filler {\n      type: \"gaussian\"\n      std: 0.005\n    }\n    bias_filler {\n      type: \"constant\"\n      value: 1\n    }\n  }\n}\nlayer {\n  name: \"relu7\"\n  type: \"ReLU\"\n  bottom: \"fc7\"\n  top: \"fc7\"\n}\nlayer {\n  name: \"drop7\"\n  type: \"Dropout\"\n  bottom: \"fc7\"\n  top: \"fc7\"\n  dropout_param {\n    dropout_ratio: 0.5\n  }\n}\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\nlayer {\n  
name: \"lstm1\"\n  type: \"Lstm\"\n  bottom: \"fc7\"\n  bottom: \"clip\"\n  top: \"lstm1\"\n\n  param {\n    lr_mult: 1\n    decay_mult: 1\n  }\n  param {\n    lr_mult: 2\n    decay_mult: 0\n  }\n\n  lstm_param {\n    num_output: 3000\n    clipping_threshold: 0.1\n    weight_filler {\n      type: \"gaussian\"\n      std: 0.1\n    }\n    bias_filler {\n      type: \"constant\"\n    }\n  }\n}\n\nlayer {\n  name: \"fc8_volleyball\"\n  type: \"InnerProduct\"\n  bottom: \"lstm1\"\n  top: \"fc8_volleyball\"\n  # lr_mult is set to higher than for other layers, because this layer is starting from random while the others are already trained\n  param {\n    lr_mult: 10\n    decay_mult: 1\n  }\n  param {\n    lr_mult: 20\n    decay_mult: 0\n  }\n  inner_product_param {\n    num_output: 9\n    weight_filler {\n      type: \"gaussian\"\n      std: 0.01\n    }\n    bias_filler {\n      type: \"constant\"\n      value: 0\n    }\n  }\n}\n\nlayer {\n  name: \"loss\"\n  type: \"SoftmaxWithLoss\"\n  bottom: \"fc8_volleyball\"\n  bottom: \"label\"\n}\nlayer {\n  name: \"accuracy\"\n  type: \"Accuracy\"\n  bottom: \"fc8_volleyball\"\n  bottom: \"label\"\n  top: \"accuracy\"\n  include {\n    phase: TEST\n  }\n}\n\n\n"
  },
  {
    "path": "ibrahim16-cvpr-simple/p1-network1/trainval-test-solver.prototxt",
    "content": "net: \"examples/deep-activity-rec/ibrahim16-cvpr-simple/p1-network1/trainval-test-network.prototxt\"\n\n# testing examples are 77655 ~= 250 * 310\ntest_iter: 1\ntest_interval: 1\ndisplay: 1\n\nbase_lr: 0.00001\nlr_policy: \"step\"\ngamma: 0.1\nstepsize: 1\nmax_iter: 1\nmomentum: 0.9\nweight_decay: 0.0005\n\nrandom_seed: 750301\nsolver_mode: GPU\n\nsnapshot: 1\nsnapshot_prefix: \"examples/deep-activity-rec/ibrahim16-cvpr-simple/p1-network1/z_snapshot\"\nsnapshot_after_train: true\n\n"
  },
  {
    "path": "ibrahim16-cvpr-simple/p3-extract-features-networks/test.prototxt",
    "content": "name: \"volleyball_proto\"\n\nlayer {\n  name: \"volleyball\"\n  type: \"Data\"\n  top: \"data\"\n  top: \"label\"\n  include {\n    phase: TEST\n  }\n  transform_param {\n    mirror: false\n    crop_size: 227\n   \n     mean_file: \"examples/deep-activity-rec/ibrahim16-cvpr-simple/p2-ready-fuse/mean.binaryproto\"\n  }\n  data_param {\n    source: \"examples/deep-activity-rec/ibrahim16-cvpr-simple/p2-ready-fuse/test-leveldb\"\n    batch_size: 120\n    backend: LEVELDB\n  }\n}\nlayer {\n  name: \"conv1\"\n  type: \"Convolution\"\n  bottom: \"data\"\n  top: \"conv1\"\n  param {\n    lr_mult: 1\n    decay_mult: 1\n  }\n  param {\n    lr_mult: 2\n    decay_mult: 0\n  }\n  convolution_param {\n    num_output: 96\n    kernel_size: 11\n    stride: 4\n    weight_filler {\n      type: \"gaussian\"\n      std: 0.01\n    }\n    bias_filler {\n      type: \"constant\"\n      value: 0\n    }\n  }\n}\nlayer {\n  name: \"relu1\"\n  type: \"ReLU\"\n  bottom: \"conv1\"\n  top: \"conv1\"\n}\nlayer {\n  name: \"pool1\"\n  type: \"Pooling\"\n  bottom: \"conv1\"\n  top: \"pool1\"\n  pooling_param {\n    pool: MAX\n    kernel_size: 3\n    stride: 2\n  }\n}\nlayer {\n  name: \"norm1\"\n  type: \"LRN\"\n  bottom: \"pool1\"\n  top: \"norm1\"\n  lrn_param {\n    local_size: 5\n    alpha: 0.0001\n    beta: 0.75\n  }\n}\nlayer {\n  name: \"conv2\"\n  type: \"Convolution\"\n  bottom: \"norm1\"\n  top: \"conv2\"\n  param {\n    lr_mult: 1\n    decay_mult: 1\n  }\n  param {\n    lr_mult: 2\n    decay_mult: 0\n  }\n  convolution_param {\n    num_output: 256\n    pad: 2\n    kernel_size: 5\n    group: 2\n    weight_filler {\n      type: \"gaussian\"\n      std: 0.01\n    }\n    bias_filler {\n      type: \"constant\"\n      value: 1\n    }\n  }\n}\nlayer {\n  name: \"relu2\"\n  type: \"ReLU\"\n  bottom: \"conv2\"\n  top: \"conv2\"\n}\nlayer {\n  name: \"pool2\"\n  type: \"Pooling\"\n  bottom: \"conv2\"\n  top: \"pool2\"\n  pooling_param {\n    pool: MAX\n    kernel_size: 3\n    
stride: 2\n  }\n}\nlayer {\n  name: \"norm2\"\n  type: \"LRN\"\n  bottom: \"pool2\"\n  top: \"norm2\"\n  lrn_param {\n    local_size: 5\n    alpha: 0.0001\n    beta: 0.75\n  }\n}\nlayer {\n  name: \"conv3\"\n  type: \"Convolution\"\n  bottom: \"norm2\"\n  top: \"conv3\"\n  param {\n    lr_mult: 1\n    decay_mult: 1\n  }\n  param {\n    lr_mult: 2\n    decay_mult: 0\n  }\n  convolution_param {\n    num_output: 384\n    pad: 1\n    kernel_size: 3\n    weight_filler {\n      type: \"gaussian\"\n      std: 0.01\n    }\n    bias_filler {\n      type: \"constant\"\n      value: 0\n    }\n  }\n}\nlayer {\n  name: \"relu3\"\n  type: \"ReLU\"\n  bottom: \"conv3\"\n  top: \"conv3\"\n}\nlayer {\n  name: \"conv4\"\n  type: \"Convolution\"\n  bottom: \"conv3\"\n  top: \"conv4\"\n  param {\n    lr_mult: 1\n    decay_mult: 1\n  }\n  param {\n    lr_mult: 2\n    decay_mult: 0\n  }\n  convolution_param {\n    num_output: 384\n    pad: 1\n    kernel_size: 3\n    group: 2\n    weight_filler {\n      type: \"gaussian\"\n      std: 0.01\n    }\n    bias_filler {\n      type: \"constant\"\n      value: 1\n    }\n  }\n}\nlayer {\n  name: \"relu4\"\n  type: \"ReLU\"\n  bottom: \"conv4\"\n  top: \"conv4\"\n}\nlayer {\n  name: \"conv5\"\n  type: \"Convolution\"\n  bottom: \"conv4\"\n  top: \"conv5\"\n  param {\n    lr_mult: 1\n    decay_mult: 1\n  }\n  param {\n    lr_mult: 2\n    decay_mult: 0\n  }\n  convolution_param {\n    num_output: 256\n    pad: 1\n    kernel_size: 3\n    group: 2\n    weight_filler {\n      type: \"gaussian\"\n      std: 0.01\n    }\n    bias_filler {\n      type: \"constant\"\n      value: 1\n    }\n  }\n}\nlayer {\n  name: \"relu5\"\n  type: \"ReLU\"\n  bottom: \"conv5\"\n  top: \"conv5\"\n}\nlayer {\n  name: \"pool5\"\n  type: \"Pooling\"\n  bottom: \"conv5\"\n  top: \"pool5\"\n  pooling_param {\n    pool: MAX\n    kernel_size: 3\n    stride: 2\n  }\n}\nlayer {\n  name: \"fc6\"\n  type: \"InnerProduct\"\n  bottom: \"pool5\"\n  top: \"fc6\"\n  param {\n    
lr_mult: 1\n    decay_mult: 1\n  }\n  param {\n    lr_mult: 2\n    decay_mult: 0\n  }\n  inner_product_param {\n    num_output: 4096\n    weight_filler {\n      type: \"gaussian\"\n      std: 0.005\n    }\n    bias_filler {\n      type: \"constant\"\n      value: 1\n    }\n  }\n}\nlayer {\n  name: \"relu6\"\n  type: \"ReLU\"\n  bottom: \"fc6\"\n  top: \"fc6\"\n}\nlayer {\n  name: \"drop6\"\n  type: \"Dropout\"\n  bottom: \"fc6\"\n  top: \"fc6\"\n  dropout_param {\n    dropout_ratio: 0.5\n  }\n}\nlayer {\n  name: \"fc7\"\n  type: \"InnerProduct\"\n  bottom: \"fc6\"\n  top: \"fc7\"\n  # Note that lr_mult can be set to 0 to disable any fine-tuning of this, and any other, layer\n  param {\n    lr_mult: 1\n    decay_mult: 1\n  }\n  param {\n    lr_mult: 2\n    decay_mult: 0\n  }\n  inner_product_param {\n    num_output: 4096\n    weight_filler {\n      type: \"gaussian\"\n      std: 0.005\n    }\n    bias_filler {\n      type: \"constant\"\n      value: 1\n    }\n  }\n}\nlayer {\n  name: \"relu7\"\n  type: \"ReLU\"\n  bottom: \"fc7\"\n  top: \"fc7\"\n}\nlayer {\n  name: \"drop7\"\n  type: \"Dropout\"\n  bottom: \"fc7\"\n  top: \"fc7\"\n  dropout_param {\n    dropout_ratio: 0.5\n  }\n}\n\nlayer {\n  name: \"lstm1\"\n  type: \"Lstm\"\n  bottom: \"fc7\"\n  top: \"lstm1\"\n\n  param {\n    lr_mult: 10\n    decay_mult: 1\n  }\n  param {\n    lr_mult: 20\n    decay_mult: 0\n  }\n\n  lstm_param {\n    num_output: 3000\n    clipping_threshold: 0.1\n    weight_filler {\n      type: \"gaussian\"\n      std: 0.1\n    }\n    bias_filler {\n      type: \"constant\"\n    }\n  }\n}\n\nlayer {\n  name: \"fc8_volleyball\"\n  type: \"InnerProduct\"\n  bottom: \"lstm1\"\n  top: \"fc8_volleyball\"\n  # lr_mult is set to higher than for other layers, because this layer is starting from random while the others are already trained\n  param {\n    lr_mult: 10\n    decay_mult: 1\n  }\n  param {\n    lr_mult: 20\n    decay_mult: 0\n  }\n  inner_product_param {\n    num_output: 9\n    
weight_filler {\n      type: \"gaussian\"\n      std: 0.01\n    }\n    bias_filler {\n      type: \"constant\"\n      value: 0\n    }\n  }\n}\nlayer {\n  name: \"loss\"\n  type: \"SoftmaxWithLoss\"\n  bottom: \"fc8_volleyball\"\n  bottom: \"label\"\n}\nlayer {\n  name: \"accuracy\"\n  type: \"Accuracy\"\n  bottom: \"fc8_volleyball\"\n  bottom: \"label\"\n  top: \"accuracy\"\n  include {\n    phase: TEST\n  }\n}\n\n\n\nlayer {\n  name: \"prop\"\n  type: \"Softmax\"\n  bottom: \"fc8_volleyball\"\n  top: \"prop\"\n}\n\nlayer {\n  name: \"Silence\"\n  type: \"Silence\"\n  bottom: \"prop\"\n}\n"
  },
  {
    "path": "ibrahim16-cvpr-simple/p3-extract-features-networks/trainval.prototxt",
    "content": "name: \"volleyball_proto\"\n\nlayer {\n  name: \"volleyball\"\n  type: \"Data\"\n  top: \"data\"\n  top: \"label\"\n  include {\n    phase: TEST\n  }\n  transform_param {\n    mirror: false\n    crop_size: 227\n   \n     mean_file: \"examples/deep-activity-rec/ibrahim16-cvpr-simple/p2-ready-fuse/mean.binaryproto\"\n  }\n  data_param {\n    source: \"examples/deep-activity-rec/ibrahim16-cvpr-simple/p2-ready-fuse/trainval-leveldb\"\n    batch_size: 120\n    backend: LEVELDB\n  }\n}\nlayer {\n  name: \"conv1\"\n  type: \"Convolution\"\n  bottom: \"data\"\n  top: \"conv1\"\n  param {\n    lr_mult: 1\n    decay_mult: 1\n  }\n  param {\n    lr_mult: 2\n    decay_mult: 0\n  }\n  convolution_param {\n    num_output: 96\n    kernel_size: 11\n    stride: 4\n    weight_filler {\n      type: \"gaussian\"\n      std: 0.01\n    }\n    bias_filler {\n      type: \"constant\"\n      value: 0\n    }\n  }\n}\nlayer {\n  name: \"relu1\"\n  type: \"ReLU\"\n  bottom: \"conv1\"\n  top: \"conv1\"\n}\nlayer {\n  name: \"pool1\"\n  type: \"Pooling\"\n  bottom: \"conv1\"\n  top: \"pool1\"\n  pooling_param {\n    pool: MAX\n    kernel_size: 3\n    stride: 2\n  }\n}\nlayer {\n  name: \"norm1\"\n  type: \"LRN\"\n  bottom: \"pool1\"\n  top: \"norm1\"\n  lrn_param {\n    local_size: 5\n    alpha: 0.0001\n    beta: 0.75\n  }\n}\nlayer {\n  name: \"conv2\"\n  type: \"Convolution\"\n  bottom: \"norm1\"\n  top: \"conv2\"\n  param {\n    lr_mult: 1\n    decay_mult: 1\n  }\n  param {\n    lr_mult: 2\n    decay_mult: 0\n  }\n  convolution_param {\n    num_output: 256\n    pad: 2\n    kernel_size: 5\n    group: 2\n    weight_filler {\n      type: \"gaussian\"\n      std: 0.01\n    }\n    bias_filler {\n      type: \"constant\"\n      value: 1\n    }\n  }\n}\nlayer {\n  name: \"relu2\"\n  type: \"ReLU\"\n  bottom: \"conv2\"\n  top: \"conv2\"\n}\nlayer {\n  name: \"pool2\"\n  type: \"Pooling\"\n  bottom: \"conv2\"\n  top: \"pool2\"\n  pooling_param {\n    pool: MAX\n    kernel_size: 3\n    
stride: 2\n  }\n}\nlayer {\n  name: \"norm2\"\n  type: \"LRN\"\n  bottom: \"pool2\"\n  top: \"norm2\"\n  lrn_param {\n    local_size: 5\n    alpha: 0.0001\n    beta: 0.75\n  }\n}\nlayer {\n  name: \"conv3\"\n  type: \"Convolution\"\n  bottom: \"norm2\"\n  top: \"conv3\"\n  param {\n    lr_mult: 1\n    decay_mult: 1\n  }\n  param {\n    lr_mult: 2\n    decay_mult: 0\n  }\n  convolution_param {\n    num_output: 384\n    pad: 1\n    kernel_size: 3\n    weight_filler {\n      type: \"gaussian\"\n      std: 0.01\n    }\n    bias_filler {\n      type: \"constant\"\n      value: 0\n    }\n  }\n}\nlayer {\n  name: \"relu3\"\n  type: \"ReLU\"\n  bottom: \"conv3\"\n  top: \"conv3\"\n}\nlayer {\n  name: \"conv4\"\n  type: \"Convolution\"\n  bottom: \"conv3\"\n  top: \"conv4\"\n  param {\n    lr_mult: 1\n    decay_mult: 1\n  }\n  param {\n    lr_mult: 2\n    decay_mult: 0\n  }\n  convolution_param {\n    num_output: 384\n    pad: 1\n    kernel_size: 3\n    group: 2\n    weight_filler {\n      type: \"gaussian\"\n      std: 0.01\n    }\n    bias_filler {\n      type: \"constant\"\n      value: 1\n    }\n  }\n}\nlayer {\n  name: \"relu4\"\n  type: \"ReLU\"\n  bottom: \"conv4\"\n  top: \"conv4\"\n}\nlayer {\n  name: \"conv5\"\n  type: \"Convolution\"\n  bottom: \"conv4\"\n  top: \"conv5\"\n  param {\n    lr_mult: 1\n    decay_mult: 1\n  }\n  param {\n    lr_mult: 2\n    decay_mult: 0\n  }\n  convolution_param {\n    num_output: 256\n    pad: 1\n    kernel_size: 3\n    group: 2\n    weight_filler {\n      type: \"gaussian\"\n      std: 0.01\n    }\n    bias_filler {\n      type: \"constant\"\n      value: 1\n    }\n  }\n}\nlayer {\n  name: \"relu5\"\n  type: \"ReLU\"\n  bottom: \"conv5\"\n  top: \"conv5\"\n}\nlayer {\n  name: \"pool5\"\n  type: \"Pooling\"\n  bottom: \"conv5\"\n  top: \"pool5\"\n  pooling_param {\n    pool: MAX\n    kernel_size: 3\n    stride: 2\n  }\n}\nlayer {\n  name: \"fc6\"\n  type: \"InnerProduct\"\n  bottom: \"pool5\"\n  top: \"fc6\"\n  param {\n    
lr_mult: 1\n    decay_mult: 1\n  }\n  param {\n    lr_mult: 2\n    decay_mult: 0\n  }\n  inner_product_param {\n    num_output: 4096\n    weight_filler {\n      type: \"gaussian\"\n      std: 0.005\n    }\n    bias_filler {\n      type: \"constant\"\n      value: 1\n    }\n  }\n}\nlayer {\n  name: \"relu6\"\n  type: \"ReLU\"\n  bottom: \"fc6\"\n  top: \"fc6\"\n}\nlayer {\n  name: \"drop6\"\n  type: \"Dropout\"\n  bottom: \"fc6\"\n  top: \"fc6\"\n  dropout_param {\n    dropout_ratio: 0.5\n  }\n}\nlayer {\n  name: \"fc7\"\n  type: \"InnerProduct\"\n  bottom: \"fc6\"\n  top: \"fc7\"\n  # Note that lr_mult can be set to 0 to disable any fine-tuning of this, and any other, layer\n  param {\n    lr_mult: 1\n    decay_mult: 1\n  }\n  param {\n    lr_mult: 2\n    decay_mult: 0\n  }\n  inner_product_param {\n    num_output: 4096\n    weight_filler {\n      type: \"gaussian\"\n      std: 0.005\n    }\n    bias_filler {\n      type: \"constant\"\n      value: 1\n    }\n  }\n}\nlayer {\n  name: \"relu7\"\n  type: \"ReLU\"\n  bottom: \"fc7\"\n  top: \"fc7\"\n}\nlayer {\n  name: \"drop7\"\n  type: \"Dropout\"\n  bottom: \"fc7\"\n  top: \"fc7\"\n  dropout_param {\n    dropout_ratio: 0.5\n  }\n}\n\nlayer {\n  name: \"lstm1\"\n  type: \"Lstm\"\n  bottom: \"fc7\"\n  top: \"lstm1\"\n\n  param {\n    lr_mult: 10\n    decay_mult: 1\n  }\n  param {\n    lr_mult: 20\n    decay_mult: 0\n  }\n\n  lstm_param {\n    num_output: 3000\n    clipping_threshold: 0.1\n    weight_filler {\n      type: \"gaussian\"\n      std: 0.1\n    }\n    bias_filler {\n      type: \"constant\"\n    }\n  }\n}\n\nlayer {\n  name: \"fc8_volleyball\"\n  type: \"InnerProduct\"\n  bottom: \"lstm1\"\n  top: \"fc8_volleyball\"\n  # lr_mult is set to higher than for other layers, because this layer is starting from random while the others are already trained\n  param {\n    lr_mult: 10\n    decay_mult: 1\n  }\n  param {\n    lr_mult: 20\n    decay_mult: 0\n  }\n  inner_product_param {\n    num_output: 9\n    
weight_filler {\n      type: \"gaussian\"\n      std: 0.01\n    }\n    bias_filler {\n      type: \"constant\"\n      value: 0\n    }\n  }\n}\nlayer {\n  name: \"loss\"\n  type: \"SoftmaxWithLoss\"\n  bottom: \"fc8_volleyball\"\n  bottom: \"label\"\n}\nlayer {\n  name: \"accuracy\"\n  type: \"Accuracy\"\n  bottom: \"fc8_volleyball\"\n  bottom: \"label\"\n  top: \"accuracy\"\n  include {\n    phase: TEST\n  }\n}\n\n\n\nlayer {\n  name: \"prop\"\n  type: \"Softmax\"\n  bottom: \"fc8_volleyball\"\n  top: \"prop\"\n}\n\nlayer {\n  name: \"Silence\"\n  type: \"Silence\"\n  bottom: \"prop\"\n}\n"
  },
  {
    "path": "ibrahim16-cvpr-simple/p4-network2/clip_w10.txt",
    "content": "examples/deep-activity-rec/ibrahim16-cvpr-simple/none.jpg 0\nexamples/deep-activity-rec/ibrahim16-cvpr-simple/none.jpg 1\nexamples/deep-activity-rec/ibrahim16-cvpr-simple/none.jpg 1\nexamples/deep-activity-rec/ibrahim16-cvpr-simple/none.jpg 1\nexamples/deep-activity-rec/ibrahim16-cvpr-simple/none.jpg 1\nexamples/deep-activity-rec/ibrahim16-cvpr-simple/none.jpg 1\nexamples/deep-activity-rec/ibrahim16-cvpr-simple/none.jpg 1\nexamples/deep-activity-rec/ibrahim16-cvpr-simple/none.jpg 1\nexamples/deep-activity-rec/ibrahim16-cvpr-simple/none.jpg 1\nexamples/deep-activity-rec/ibrahim16-cvpr-simple/none.jpg 1\n"
  },
  {
    "path": "ibrahim16-cvpr-simple/p4-network2/trainval-test-exe-script-resume.sh",
    "content": "#!/usr/bin/env sh\n\nOUTDIR=examples/deep-activity-rec/ibrahim16-cvpr-simple/p4-network2\nGPU_ID=0\nITER=10000\n\necho \"Resuming Caffe using GPU\" $GPU_ID \"In Directory \" $OUTDIR \"Starting from iteration \" $ITER\n\n./build/tools/caffe train 2> $OUTDIR/z_trainval-test-log-resume.txt \\\n  --solver $OUTDIR/trainval-test-solver.prototxt --snapshot=$OUTDIR/z_snapshot_iter_$ITER.solverstate  --gpu $GPU_ID\n"
  },
  {
    "path": "ibrahim16-cvpr-simple/p4-network2/trainval-test-exe-script.sh",
    "content": "#!/usr/bin/env sh\n\nOUTDIR=examples/deep-activity-rec/ibrahim16-cvpr-simple/p4-network2\nGPU_ID=0\n\necho \"Running Caffe using GPU\" $GPU_ID \"In Directory \" $OUTDIR\n\n./build/tools/caffe train 2> $OUTDIR/z_trainval-test-log.txt \\\n  --solver $OUTDIR/trainval-test-solver.prototxt  --gpu $GPU_ID\n"
  },
  {
    "path": "ibrahim16-cvpr-simple/p4-network2/trainval-test-network.prototxt",
    "content": "name: \"volleyball_level2\"\nlayer {\n  name: \"clip_data\"\n  type: \"ImageData\"\n  top: \"dummy\"\n  top: \"clip\"\n\n  image_data_param {\n    source: \"examples/deep-activity-rec/ibrahim16-cvpr-simple/p4-network2/clip_w10.txt\"\n    batch_size: 10\n  }\n}\n\nlayer {\n  name: \"Silence\"\n  type: \"Silence\"\n  bottom: \"dummy\"\n}\n\nlayer {\n  name: \"volleyball_level2\"\n  type: \"Data\"\n  top: \"data\"\n  top: \"label\"\n  include {\n    phase: TRAIN\n  }\n  data_param {\n    source: \"examples/deep-activity-rec/ibrahim16-cvpr-simple/p4-network2/trainval-leveldb\"\n    batch_size: 10\n    backend: LEVELDB\n  }\n}\nlayer {\n  name: \"volleyball_level2\"\n  type: \"Data\"\n  top: \"data\"\n  top: \"label\"\n  include {\n    phase: TEST\n  }\n  data_param {\n    source: \"examples/deep-activity-rec/ibrahim16-cvpr-simple/p4-network2/test-leveldb\"\n    batch_size: 10\n    backend: LEVELDB\n  }\n}\n\nlayer {\n  name: \"fc1\"\n  type: \"InnerProduct\"\n  bottom: \"data\"\n  top: \"fc1\"\n  param {\n    lr_mult: 1\n    decay_mult: 1\n  }\n  param {\n    lr_mult: 2\n    decay_mult: 0\n  }\n  inner_product_param {\n    num_output: 3000\n    weight_filler {\n      type: \"gaussian\"\n      std: 0.01\n    }\n    bias_filler {\n      type: \"constant\"\n      value: 0\n    }\n  }\n}\n\nlayer {\n  name: \"relu7\"\n  type: \"ReLU\"\n  bottom: \"fc1\"\n  top: \"fc1\"\n}\nlayer {\n  name: \"drop7\"\n  type: \"Dropout\"\n  bottom: \"fc1\"\n  top: \"fc1\"\n  dropout_param {\n    dropout_ratio: 0.5\n  }\n}\n\nlayer {\n  name: \"lstm1\"\n  type: \"Lstm\"\n  bottom: \"fc1\"\n  bottom: \"clip\"\n  top: \"lstm1\"\n\n  param {\n    lr_mult: 1\n    decay_mult: 1\n  }\n  param {\n    lr_mult: 2\n    decay_mult: 0\n  }\n\n  lstm_param {\n    num_output: 1000\n    clipping_threshold: 0.1\n    weight_filler {\n      type: \"gaussian\"\n      std: 0.1\n    }\n    bias_filler {\n      type: \"constant\"\n    }\n  }\n}\n\nlayer {\n  name: \"fc_last\"\n  type: 
\"InnerProduct\"\n  bottom: \"lstm1\"\n  top: \"fc_last\"\n  param {\n    lr_mult: 1\n    decay_mult: 1\n  }\n  param {\n    lr_mult: 2\n    decay_mult: 0\n  }\n  inner_product_param {\n    num_output: 8\n    weight_filler {\n      type: \"gaussian\"\n      std: 0.01\n    }\n    bias_filler {\n      type: \"constant\"\n      value: 0\n    }\n  }\n}\n\nlayer {\n  name: \"loss\"\n  type: \"SoftmaxWithLoss\"\n  bottom: \"fc_last\"\n  bottom: \"label\"\n}\n\nlayer {\n  name: \"accuracy\"\n  type: \"Accuracy\"\n  bottom: \"fc_last\"\n  bottom: \"label\"\n  top: \"accuracy\"\n  include {\n    phase: TEST\n  }\n}\n\n\n\n\n\n\n"
  },
  {
    "path": "ibrahim16-cvpr-simple/p4-network2/trainval-test-solver.prototxt",
    "content": "net: \"examples/deep-activity-rec/ibrahim16-cvpr-simple/p4-network2/trainval-test-network.prototxt\"\n\n# testing examples are 13370 = 1337 * 10.\ntest_iter: 2\ntest_interval: 2\ndisplay: 2\n\nbase_lr: 0.0001\nlr_policy: \"step\"\ngamma: 0.1\nstepsize: 2\nmax_iter: 2\nmomentum: 0.9\nweight_decay: 0.0005\n\nrandom_seed: 750301\nsolver_mode: GPU\n\nsnapshot: 2\nsnapshot_prefix: \"examples/deep-activity-rec/ibrahim16-cvpr-simple/p4-network2/z_snapshot\"\nsnapshot_after_train: true\n\n"
  },
  {
    "path": "ibrahim16-cvpr-simple/p4-network2/trainval-test-window-evaluation-exe-script.sh",
    "content": "#!/usr/bin/env sh\n\nOUTDIR=examples/deep-activity-rec/ibrahim16-cvpr-simple/p4-network2\nWINDOW=10\nGPU_ID=0\n\nTEST_EXAMPLES=7\nITER=2\nLAYER=prop\n\n\nexamples/deep-activity-rec/exePhase4  \\\n $WINDOW \\\n GPU $GPU_ID \\\n $OUTDIR/z_snapshot_iter_$ITER.caffemodel \\\n $OUTDIR/trainval-test-window-evaluation-network.prototxt \\\n $LAYER \\\n $TEST_EXAMPLES \\\n 2>&1 |  tee   $OUTDIR/z_trainval-test-window-evaluation-log-prop.txt \n"
  },
  {
    "path": "ibrahim16-cvpr-simple/p4-network2/trainval-test-window-evaluation-network.prototxt",
    "content": "name: \"volleyball_level2\"\n\nlayer {\n  name: \"volleyball_data\"\n  type: \"Data\"\n  top: \"data\"\n  top: \"label\"\n  include {\n    phase: TEST\n  }\n  data_param {\n    source: \"examples/deep-activity-rec/ibrahim16-cvpr-simple/p4-network2/test-leveldb\"\n    batch_size: 10\n    backend: LEVELDB\n  }\n}\n\nlayer {\n  name: \"fc1\"\n  type: \"InnerProduct\"\n  bottom: \"data\"\n  top: \"fc1\"\n  param {\n    lr_mult: 1\n    decay_mult: 1\n  }\n  param {\n    lr_mult: 2\n    decay_mult: 0\n  }\n  inner_product_param {\n    num_output: 3000\n    weight_filler {\n      type: \"gaussian\"\n      std: 0.01\n    }\n    bias_filler {\n      type: \"constant\"\n      value: 0\n    }\n  }\n}\n\nlayer {\n  name: \"relu7\"\n  type: \"ReLU\"\n  bottom: \"fc1\"\n  top: \"fc1\"\n}\nlayer {\n  name: \"drop7\"\n  type: \"Dropout\"\n  bottom: \"fc1\"\n  top: \"fc1\"\n  dropout_param {\n    dropout_ratio: 0.5\n  }\n}\n\nlayer {\n  name: \"lstm1\"\n  type: \"Lstm\"\n  bottom: \"fc1\"\n  top: \"lstm1\"\n\n  param {\n    lr_mult: 1\n    decay_mult: 1\n  }\n  param {\n    lr_mult: 2\n    decay_mult: 0\n  }\n\n  lstm_param {\n    num_output: 1000\n    clipping_threshold: 0.1\n    weight_filler {\n      type: \"gaussian\"\n      std: 0.1\n    }\n    bias_filler {\n      type: \"constant\"\n    }\n  }\n}\n\nlayer {\n  name: \"fc_last\"\n  type: \"InnerProduct\"\n  bottom: \"lstm1\"\n  top: \"fc_last\"\n  param {\n    lr_mult: 1\n    decay_mult: 1\n  }\n  param {\n    lr_mult: 2\n    decay_mult: 0\n  }\n  inner_product_param {\n    num_output: 8\n    weight_filler {\n      type: \"gaussian\"\n      std: 0.01\n    }\n    bias_filler {\n      type: \"constant\"\n      value: 0\n    }\n  }\n}\n\nlayer {\n  name: \"loss\"\n  type: \"SoftmaxWithLoss\"\n  bottom: \"fc_last\"\n  bottom: \"label\"\n}\n\nlayer {\n  name: \"accuracy\"\n  type: \"Accuracy\"\n  bottom: \"fc_last\"\n  bottom: \"label\"\n  top: \"accuracy\"\n  include {\n    phase: TEST\n  }\n}\n\n\n\n\n\n\n\nlayer 
{\n  name: \"prop\"\n  type: \"Softmax\"\n  bottom: \"fc_last\"\n  top: \"prop\"\n}\n\nlayer {\n  name: \"argmax\"\n  type: \"ArgMax\"\n  argmax_param {\n    out_max_val: false\n    top_k: 1\n  }\n  bottom: \"prop\"\n  top: \"argmax\"\n}\n"
  },
  {
    "path": "ibrahim16-cvpr-simple/script-clean.sh",
    "content": "#!/usr/bin/env bash\n\nROOT_DIR=examples/deep-activity-rec/ibrahim16-cvpr-simple\n\n# Phase 1 artifacts\nrm -r $ROOT_DIR/p1-network1/test-leveldb\nrm -r $ROOT_DIR/p1-network1/trainval-leveldb\nrm $ROOT_DIR/p1-network1/mean.binaryproto\nrm $ROOT_DIR/p1-network1/z_log_dataset_net1.txt\nrm $ROOT_DIR/p1-network1/z_trainval-test-log.txt\nrm $ROOT_DIR/p1-network1/z_snapshot_iter_*.caffemodel\nrm $ROOT_DIR/p1-network1/z_snapshot_iter_*.solverstate\n\n# Phase 2 artifacts\nrm -r $ROOT_DIR/p2-ready-fuse\n\n# Phase 3 & 4 artifacts\nrm -r $ROOT_DIR/p4-network2/test-leveldb\nrm -r $ROOT_DIR/p4-network2/trainval-leveldb\nrm $ROOT_DIR/p4-network2/z_log_dataset_net2.txt\nrm $ROOT_DIR/p4-network2/z_trainval-test-log.txt\nrm $ROOT_DIR/p4-network2/z_trainval-test-window-evaluation-log-prop.txt\nrm $ROOT_DIR/p4-network2/z_snapshot_iter_*.caffemodel\nrm $ROOT_DIR/p4-network2/z_snapshot_iter_*.solverstate\n"
  },
  {
    "path": "ibrahim16-cvpr-simple/script-simple-expected-log.txt",
    "content": "mostafa@mostafa:~/workspaces/git/caffe-lstm$ examples/deep-activity-rec/ibrahim16-cvpr-simple/script-simple.sh\n------------------------------------------------------\n\nSTART processing script examples/deep-activity-rec/ibrahim16-cvpr-simple/script-simple.sh\nOUTPUT Directory is  /home/mostafa/workspaces/git/caffe-lstm/examples/deep-activity-rec/ibrahim16-cvpr-simple\n\nDoing path VALIDATIONS\nREADY...STEADY...Gooo ?\n\n------------------------------------------------------\nPhase 1 - Generating Network 1 Data - /home/mostafa/workspaces/git/caffe-lstm/examples/deep-activity-rec/ibrahim16-cvpr-simple/p1-network1\nStart: /home/mostafa/workspaces/git/caffe-lstm/examples/deep-activity-rec/exePhase1_2\nLSTM 1 preparation\nLoading the dataset...\nPreparing Dataset: train\n\tfrom config file: /home/mostafa/workspaces/git/caffe-lstm/examples/deep-activity-rec/dataset-config-simple/train.txt\n\n\n************************\n\nPreparing Dataset: val\n\tfrom config file: /home/mostafa/workspaces/git/caffe-lstm/examples/deep-activity-rec/dataset-config-simple/val.txt\n\n\n************************\n\nPreparing Dataset: test\n\tfrom config file: /home/mostafa/workspaces/git/caffe-lstm/examples/deep-activity-rec/dataset-config-simple/test.txt\n41 is processed\n\n\n************************\n\nPreparing Dataset: trainval\n\tfrom config file: /home/mostafa/workspaces/git/caffe-lstm/examples/deep-activity-rec/dataset-config-simple/trainval.txt\n39 is processed\n\n\n************************\n\ntrain dataset is EMPTY\nval dataset is EMPTY\nTotal frames for dataset test = 2\nTotal frames for dataset trainval = 2\n\nTotal videos = 2 - total frames = 4\n\nScenes Labels:\n\tl-pass 0\n\tr_spike 1\n\nPersons Labels:\n\tblocking 5\n\tdigging 1\n\tfalling 7\n\tmoving 2\n\tsetting 3\n\tspiking 6\n\tstanding 0\n\twaiting 4\n\nScenes Labels frequency:\n\tl-pass 2\n\tr_spike 2\n\nPlayers Labels frequency:\n\tblocking 1\n\tdigging 2\n\tfalling 2\n\tmoving 5\n\tsetting 1\n\tspiking 
1\n\tstanding 33\n\twaiting 3\nTemporal window = 5 with step = 1\n\nCreating a new dataset\n\n\nCreates a database at: /home/mostafa/workspaces/git/caffe-lstm/examples/deep-activity-rec/ibrahim16-cvpr-simple/p1-network1/test-leveldb/\n\t(H, W, C) = 256 256 3\nWARNING: Logging before InitGoogleLogging() is written to STDERR\nI0612 16:35:57.784998 14753 leveldb-writer.cpp:52] Opening leveldb /home/mostafa/workspaces/git/caffe-lstm/examples/deep-activity-rec/ibrahim16-cvpr-simple/p1-network1/test-leveldb/\nCreating a new dataset\n\n\nCreates a database at: /home/mostafa/workspaces/git/caffe-lstm/examples/deep-activity-rec/ibrahim16-cvpr-simple/p1-network1/trainval-leveldb/\n\t(H, W, C) = 256 256 3\nI0612 16:35:57.907624 14753 leveldb-writer.cpp:52] Opening leveldb /home/mostafa/workspaces/git/caffe-lstm/examples/deep-activity-rec/ibrahim16-cvpr-simple/p1-network1/trainval-leveldb/\nExtracting shuffled elements from test Data Set. Total videos = 1\nTotal images for current data set is 2. Overall entries will be <= 120\nE0612 16:36:01.151640 14753 leveldb-writer.cpp:223] /home/mostafa/workspaces/git/caffe-lstm/examples/deep-activity-rec/ibrahim16-cvpr-simple/p1-network1/test-leveldb/: Processed 120 files.\n\nLabels Statistics for db /home/mostafa/workspaces/git/caffe-lstm/examples/deep-activity-rec/ibrahim16-cvpr-simple/p1-network1/test-leveldb/\nTotal Records 120\n*********************************************************\nKey = 0\t => Value 100 instances\nKey = 1\t => Value 5 instances\nKey = 2\t => Value 10 instances\nKey = 3\t => Value 5 instances\n*********************************************************\nKey = 0\t => Value 83.3 %\nKey = 1\t => Value 4.2 %\nKey = 2\t => Value 8.3 %\nKey = 3\t => Value 4.2 %\nExtracting shuffled elements from trainval Data Set. Total videos = 1\nTotal images for current data set is 2. 
Overall entries will be <= 120\nE0612 16:36:04.130539 14753 leveldb-writer.cpp:223] /home/mostafa/workspaces/git/caffe-lstm/examples/deep-activity-rec/ibrahim16-cvpr-simple/p1-network1/trainval-leveldb/: Processed 120 files.\n\nLabels Statistics for db /home/mostafa/workspaces/git/caffe-lstm/examples/deep-activity-rec/ibrahim16-cvpr-simple/p1-network1/trainval-leveldb/\nTotal Records 120\n*********************************************************\nKey = 0\t => Value 65 instances\nKey = 1\t => Value 5 instances\nKey = 2\t => Value 15 instances\nKey = 4\t => Value 15 instances\nKey = 5\t => Value 5 instances\nKey = 6\t => Value 5 instances\nKey = 7\t => Value 10 instances\n*********************************************************\nKey = 0\t => Value 54.2 %\nKey = 1\t => Value 4.2 %\nKey = 2\t => Value 12.5 %\nKey = 4\t => Value 12.5 %\nKey = 5\t => Value 4.2 %\nKey = 6\t => Value 4.2 %\nKey = 7\t => Value 8.3 %\n\n\nBye: /home/mostafa/workspaces/git/caffe-lstm/examples/deep-activity-rec/exePhase1_2\n========================\nPhase 1 - B - Computing Mean of Network 1 Training Data - /home/mostafa/workspaces/git/caffe-lstm/examples/deep-activity-rec/ibrahim16-cvpr-simple/p1-network1\nComputing image mean for trainval dataset:  examples/deep-activity-rec/ibrahim16-cvpr-simple/p1-network1\nDone.\n========================\nPhase 1 - C - Network 1 Training\nRunning Caffe using GPU In Directory  examples/deep-activity-rec/ibrahim16-cvpr-simple/p1-network1\n------------------------------------------------------\nPhase 2 - A - Generating Data to be Fused - /home/mostafa/workspaces/git/caffe-lstm/examples/deep-activity-rec/ibrahim16-cvpr-simple/p2-ready-fuse\nStart: /home/mostafa/workspaces/git/caffe-lstm/examples/deep-activity-rec/exePhase1_2\nData Fusion for LSTM 2\nLoading the dataset...\nPreparing Dataset: train\n\tfrom config file: /home/mostafa/workspaces/git/caffe-lstm/examples/deep-activity-rec/dataset-config-simple/train.txt\n\n\n************************\n\nPreparing 
Dataset: val\n\tfrom config file: /home/mostafa/workspaces/git/caffe-lstm/examples/deep-activity-rec/dataset-config-simple/val.txt\n\n\n************************\n\nPreparing Dataset: test\n\tfrom config file: /home/mostafa/workspaces/git/caffe-lstm/examples/deep-activity-rec/dataset-config-simple/test.txt\n41 is processed\n\n\n************************\n\nPreparing Dataset: trainval\n\tfrom config file: /home/mostafa/workspaces/git/caffe-lstm/examples/deep-activity-rec/dataset-config-simple/trainval.txt\n39 is processed\n\n\n************************\n\ntrain dataset is EMPTY\nval dataset is EMPTY\nTotal frames for dataset test = 2\nTotal frames for dataset trainval = 2\n\nTotal videos = 2 - total frames = 4\n\nScenes Labels:\n\tl-pass 0\n\tr_spike 1\n\nPersons Labels:\n\tblocking 5\n\tdigging 1\n\tfalling 7\n\tmoving 2\n\tsetting 3\n\tspiking 6\n\tstanding 0\n\twaiting 4\n\nScenes Labels frequency:\n\tl-pass 2\n\tr_spike 2\n\nPlayers Labels frequency:\n\tblocking 1\n\tdigging 2\n\tfalling 2\n\tmoving 5\n\tsetting 1\n\tspiking 1\n\tstanding 33\n\twaiting 3\nTemporal window = 10 with step = 1\n\nCreating a new dataset\n\n\nCreates a database at: /home/mostafa/workspaces/git/caffe-lstm/examples/deep-activity-rec/ibrahim16-cvpr-simple/p2-ready-fuse/test-leveldb/\n\t(H, W, C) = 256 256 3\nWARNING: Logging before InitGoogleLogging() is written to STDERR\nI0612 16:38:30.898288 14832 leveldb-writer.cpp:52] Opening leveldb /home/mostafa/workspaces/git/caffe-lstm/examples/deep-activity-rec/ibrahim16-cvpr-simple/p2-ready-fuse/test-leveldb/\nCreating a new dataset\n\n\nCreates a database at: /home/mostafa/workspaces/git/caffe-lstm/examples/deep-activity-rec/ibrahim16-cvpr-simple/p2-ready-fuse/trainval-leveldb/\n\t(H, W, C) = 256 256 3\nI0612 16:38:31.156790 14832 leveldb-writer.cpp:52] Opening leveldb /home/mostafa/workspaces/git/caffe-lstm/examples/deep-activity-rec/ibrahim16-cvpr-simple/p2-ready-fuse/trainval-leveldb/\nExtracting shuffled elements from test Data Set. 
Total videos = 1\nTotal images for current data set is 2. Overall entries will be = 240\nE0612 16:38:38.010151 14832 leveldb-writer.cpp:223] /home/mostafa/workspaces/git/caffe-lstm/examples/deep-activity-rec/ibrahim16-cvpr-simple/p2-ready-fuse/test-leveldb/: Processed 240 files.\n\nLabels Statistics for db /home/mostafa/workspaces/git/caffe-lstm/examples/deep-activity-rec/ibrahim16-cvpr-simple/p2-ready-fuse/test-leveldb/\nTotal Records 240\n*********************************************************\nKey = 0\t => Value 120 instances\nKey = 1\t => Value 120 instances\n*********************************************************\nKey = 0\t => Value 50.0 %\nKey = 1\t => Value 50.0 %\nExtracting shuffled elements from trainval Data Set. Total videos = 1\nTotal images for current data set is 2. Overall entries will be = 240\nE0612 16:38:44.260620 14832 leveldb-writer.cpp:223] /home/mostafa/workspaces/git/caffe-lstm/examples/deep-activity-rec/ibrahim16-cvpr-simple/p2-ready-fuse/trainval-leveldb/: Processed 240 files.\n\nLabels Statistics for db /home/mostafa/workspaces/git/caffe-lstm/examples/deep-activity-rec/ibrahim16-cvpr-simple/p2-ready-fuse/trainval-leveldb/\nTotal Records 240\n*********************************************************\nKey = 0\t => Value 120 instances\nKey = 1\t => Value 120 instances\n*********************************************************\nKey = 0\t => Value 50.0 %\nKey = 1\t => Value 50.0 %\n\n\nBye: /home/mostafa/workspaces/git/caffe-lstm/examples/deep-activity-rec/exePhase1_2\n========================\nPhase 2 - B - Creating Mean File of Fused Data\nComputing image mean for dataset:  trainval\n------------------------------------------------------\nPhase 3 - Generarting LSTM 2 Data - /home/mostafa/workspaces/git/caffe-lstm/examples/deep-activity-rec/ibrahim16-cvpr-simple/p4-network2\nE0612 16:38:56.145875 14855 exePhase3.cpp:346] Make sure to have LD_LIBRARY_PATH pointing to LSTM implementation in case of LSTM\n\nE0612 16:38:56.303081 14855 
exePhase3.cpp:166] Fusing style = max_pool_players_2\n\nE0612 16:38:56.303119 14855 exePhase3.cpp:169] frames_window = 10\nE0612 16:38:56.303136 14855 exePhase3.cpp:171] Expected batch size = 120\nE0612 16:38:56.303158 14855 exePhase3.cpp:184] Using CPU\nE0612 16:38:56.303218 14855 exePhase3.cpp:191] Model: /home/mostafa/workspaces/git/caffe-lstm/examples/deep-activity-rec/ibrahim16-cvpr-simple/p1-network1/z_snapshot_iter_1.caffemodel\nE0612 16:38:56.303242 14855 exePhase3.cpp:192] Proto: /home/mostafa/workspaces/git/caffe-lstm/examples/deep-activity-rec/ibrahim16-cvpr-simple/p3-extract-features-networks/trainval.prototxt\nE0612 16:38:56.303267 14855 exePhase3.cpp:194] Creating the test network\nE0612 16:39:02.240625 14855 exePhase3.cpp:197] Loading the Model\n[libprotobuf WARNING google/protobuf/io/coded_stream.cc:505] Reading dangerously large protocol message.  If the message turns out to be larger than 2147483647 bytes, parsing will be halted for security reasons.  To increase the limit (or to disable these warnings), see CodedInputStream::SetTotalBytesLimit() in google/protobuf/io/coded_stream.h.\n[libprotobuf WARNING google/protobuf/io/coded_stream.cc:78] The total number of bytes read was 568239168\nE0612 16:39:17.920809 14855 exePhase3.cpp:206] # of blobs is 2\nE0612 16:39:17.920907 14855 exePhase3.cpp:212] blob_name: fc7\nE0612 16:39:18.012258 14855 exePhase3.cpp:212] blob_name: lstm1\nE0612 16:39:18.012333 14855 exePhase3.cpp:223] num_mini_batches: 2\n\n\nCreates a database at: /home/mostafa/workspaces/git/caffe-lstm/examples/deep-activity-rec/ibrahim16-cvpr-simple/p4-network2/trainval-leveldb\nE0612 16:39:43.161676 14855 exePhase3.cpp:278] \n\nE0612 16:39:43.161788 14855 exePhase3.cpp:285] ith Vector Length = 4096\nE0612 16:39:43.161806 14855 exePhase3.cpp:285] ith Vector Length = 3000\nE0612 16:39:43.165956 14855 exePhase3.cpp:311] Fused Vector Length = 14192\nE0612 16:40:10.215795 14855 leveldb-writer.cpp:223] 
/home/mostafa/workspaces/git/caffe-lstm/examples/deep-activity-rec/ibrahim16-cvpr-simple/p4-network2/trainval-leveldb: Processed 20 files.\n\nLabels Statistics for db /home/mostafa/workspaces/git/caffe-lstm/examples/deep-activity-rec/ibrahim16-cvpr-simple/p4-network2/trainval-leveldb\nTotal Records 20\n*********************************************************\nKey = 0\t => Value 10 instances\nKey = 1\t => Value 10 instances\n*********************************************************\nKey = 0\t => Value 50.0 %\nKey = 1\t => Value 50.0 %\nE0612 16:40:10.326062 14855 exePhase3.cpp:356] \n\nSuccessfully extracted the features!\n\nE0612 16:40:10.326143 14855 exePhase3.cpp:166] Fusing style = max_pool_players_2\n\nE0612 16:40:10.326162 14855 exePhase3.cpp:169] frames_window = 10\nE0612 16:40:10.326179 14855 exePhase3.cpp:171] Expected batch size = 120\nE0612 16:40:10.326197 14855 exePhase3.cpp:184] Using CPU\nE0612 16:40:10.326225 14855 exePhase3.cpp:191] Model: /home/mostafa/workspaces/git/caffe-lstm/examples/deep-activity-rec/ibrahim16-cvpr-simple/p1-network1/z_snapshot_iter_1.caffemodel\nE0612 16:40:10.326256 14855 exePhase3.cpp:192] Proto: /home/mostafa/workspaces/git/caffe-lstm/examples/deep-activity-rec/ibrahim16-cvpr-simple/p3-extract-features-networks/test.prototxt\nE0612 16:40:10.326279 14855 exePhase3.cpp:194] Creating the test network\nE0612 16:40:18.722841 14855 exePhase3.cpp:197] Loading the Model\n[libprotobuf WARNING google/protobuf/io/coded_stream.cc:505] Reading dangerously large protocol message.  If the message turns out to be larger than 2147483647 bytes, parsing will be halted for security reasons.  
To increase the limit (or to disable these warnings), see CodedInputStream::SetTotalBytesLimit() in google/protobuf/io/coded_stream.h.\n[libprotobuf WARNING google/protobuf/io/coded_stream.cc:78] The total number of bytes read was 568239168\nE0612 16:40:22.197417 14855 exePhase3.cpp:206] # of blobs is 2\nE0612 16:40:22.197490 14855 exePhase3.cpp:212] blob_name: fc7\nE0612 16:40:22.197540 14855 exePhase3.cpp:212] blob_name: lstm1\nE0612 16:40:22.197576 14855 exePhase3.cpp:223] num_mini_batches: 2\n\n\nCreates a database at: /home/mostafa/workspaces/git/caffe-lstm/examples/deep-activity-rec/ibrahim16-cvpr-simple/p4-network2/test-leveldb\nE0612 16:41:19.411211 14855 leveldb-writer.cpp:223] /home/mostafa/workspaces/git/caffe-lstm/examples/deep-activity-rec/ibrahim16-cvpr-simple/p4-network2/test-leveldb: Processed 20 files.\n\nLabels Statistics for db /home/mostafa/workspaces/git/caffe-lstm/examples/deep-activity-rec/ibrahim16-cvpr-simple/p4-network2/test-leveldb\nTotal Records 20\n*********************************************************\nKey = 0\t => Value 10 instances\nKey = 1\t => Value 10 instances\n*********************************************************\nKey = 0\t => Value 50.0 %\nKey = 1\t => Value 50.0 %\nE0612 16:41:19.490344 14855 exePhase3.cpp:356] \n\nSuccessfully extracted the features!\n\n------------------------------------------------------\nPhase 4 - A - LSTM 2 Training - /home/mostafa/workspaces/git/caffe-lstm/examples/deep-activity-rec/ibrahim16-cvpr-simple/p4-network2\nRunning Caffe using GPU In Directory  examples/deep-activity-rec/ibrahim16-cvpr-simple/p4-network2\n========================\nPhase 4 - B - Temporal Evaluation\nE0612 16:41:46.152066 15089 exePhase4.cpp:252] Make sure to have LD_LIBRARY_PATH pointing to LSTM implementation in case of LSTM\n\nE0612 16:41:46.169050 15089 exePhase4.cpp:146] Temporal Window = 10\nE0612 16:41:46.169098 15089 exePhase4.cpp:159] Using CPU\nE0612 16:41:46.169178 15089 exePhase4.cpp:166] Model: 
examples/deep-activity-rec/ibrahim16-cvpr-simple/p4-network2/z_snapshot_iter_2.caffemodel\nE0612 16:41:46.169193 15089 exePhase4.cpp:167] Proto: examples/deep-activity-rec/ibrahim16-cvpr-simple/p4-network2/trainval-test-window-evaluation-network.prototxt\nE0612 16:41:46.169231 15089 exePhase4.cpp:169] Creating the test network\nE0612 16:41:48.182070 15089 exePhase4.cpp:172] Loading the Model\nE0612 16:41:48.637673 15089 exePhase4.cpp:176] blob_name: prop\nE0612 16:41:48.637748 15089 exePhase4.cpp:181] num_mini_batches: 7\n\n\nTest 1: Result = 0 GroundTruth = 0\nTest 2: Result = 7 GroundTruth = 1\nTest 3: Result = 0 GroundTruth = 0\nTest 4: Result = 7 GroundTruth = 1\nTest 5: Result = 0 GroundTruth = 0\nTest 6: Result = 7 GroundTruth = 1\nTest 7: Result = 0 GroundTruth = 0\n\n\nTotal testing frames: 7 with temporal window: 1\nTemporal accuracy : 57.14 %\n\n=======================================================================================\n\nConfusion Matrix - Truth (col) / Result(row)\n\n  T/R:     0    1    7\n=======================================================================================\n    0:     4    0    0 \t=> Total Correct =     4 /     4 = 100.00 %\n    1:     0    0    3 \t=> Total Correct =     0 /     3 = 0.00 %\n    7:     0    0    0 \t=> Total Correct =     0 /     0 = 0.00 %\n\n\n    T/R:       0      1      7\n=======================================================================================\n      0:  100.00   0.00   0.00\n      1:    0.00   0.00 100.00\n      7:    0.00   0.00   0.00\n\nTo get labels corresponding to IDs..see dataset loading logs\n------------------------------------------------------\n\nDONE processing script examples/deep-activity-rec/ibrahim16-cvpr-simple/script-simple.sh\n"
  },
  {
    "path": "ibrahim16-cvpr-simple/script-simple.sh",
    "content": "#!/usr/bin/env bash\n\nCAFFE=/cs/vml2/msibrahi/workspaces/caffe-lstm\n\nGIT_PROJ_DIR=$CAFFE/examples/deep-activity-rec\nDATASET_VIDEOS=$GIT_PROJ_DIR/volleyball-simple\nDATASET_CONFIG=$GIT_PROJ_DIR/dataset-config-simple\nOUTPUT_DIR=$GIT_PROJ_DIR/ibrahim16-cvpr-simple\n\nTRAIN_SRC=trainval\nTEST_SRC=test\n\nWINDOW_NETWORK1=5\nWINDOW_NETWORK2=10\nSTEP=1\n\nGPU_ID=0\nNETWORK1_HIDDEN=3000\nNETWORK1_TRAIN_ITERS=1\n\n# Fusion Styles: Choose 0-7\n# 0 => Conc / 1 group       1 => Max / 1 group        4 => Avg / 1 group        7 => sum / 1 group\n# 2 => Max / 2 groups\t    5 => Avg / 2 groups       3 => Max / 4 groups       6 => Avg / 4 groups    \nFUSION_STYLE=2\nFUSION_TRAIN_ITER=2\nFUSION_TEST_ITER=2\n\nVAR_FUSION_LAYERS_VAL=\"2 fc7 lstm1\"\nVAR_FUSION_LAYERS=\"FUSION_LAYERS\"\ndeclare \"$VAR_FUSION_LAYERS=$VAR_FUSION_LAYERS_VAL\"\n\nNETWORK1_DIR=$OUTPUT_DIR/p1-network1\nNETWORK1_MODEL_PATH=$NETWORK1_DIR/z_snapshot_iter_$NETWORK1_TRAIN_ITERS.caffemodel\nNETWORK2_LEVELDB_FUSION_DIR=$OUTPUT_DIR/p2-ready-fuse\nNETWORK2_EXTRACTION_NETOWRK_DIR=$OUTPUT_DIR/p3-extract-features-networks\nNETWORK2_DIR=$OUTPUT_DIR/p4-network2\n\n# Programs\nEXE_P1_NETWORK1=exePhase1_2\nEXE_P2_FUSE=exePhase1_2\nEXE_P4_NETWORK2=exePhase3\n\n###########################################################################\necho ------------------------------------------------------\necho \necho \"START processing script\" \"$0\"\necho \"OUTPUT Directory is \" $OUTPUT_DIR\necho \necho Doing path VALIDATIONS\n\n## Some directories / files validation\n\n[ -d $CAFFE ]             || echo Directory $CAFFE NOOOT exist\n[ -d $OUTPUT_DIR ]        || echo Directory $OUTPUT_DIR NOOOT exist\n[ -d $DATASET_VIDEOS ]    || echo Directory $DATASET_VIDEOS NOOOT exist\n[ -d $DATASET_CONFIG ]    || echo Directory $DATASET_CONFIG NOOOT exist\n[ -d $NETWORK1_DIR ]      || echo Directory $NETWORK1_DIR NOOOT exist\n[ -d $NETWORK2_EXTRACTION_NETOWRK_DIR ] || echo Directory $NETWORK2_EXTRACTION_NETOWRK_DIR NOOOT 
exist\n[ -d $NETWORK2_DIR ]      || echo Directory $NETWORK2_DIR NOOOT exist\n\necho READY...STEADY...Gooo ?\nread -t 60\n\ncd $CAFFE\n\n\n\n\n\n\n\n\n\n\n\n###########################################################################\necho ------------------------------------------------------\necho Phase 1 - Generating Network 1 Data - $NETWORK1_DIR\n\n# Clean if some previous wrong data\n[ -d $NETWORK1_DIR/$TRAIN_SRC-leveldb ]  && [ ! -d $NETWORK1_DIR/$TEST_SRC-leveldb ] && rm -r $NETWORK1_DIR/$TRAIN_SRC-leveldb\n[ ! -d $NETWORK1_DIR/$TRAIN_SRC-leveldb ]  && [ -d $NETWORK1_DIR/$TEST_SRC-leveldb ] && rm -r $NETWORK1_DIR/$TEST_SRC-leveldb\n\n$GIT_PROJ_DIR/$EXE_P1_NETWORK1  \\\n  $DATASET_VIDEOS   \\\n  $DATASET_CONFIG   \\\n  $NETWORK1_DIR       $WINDOW_NETWORK1    $STEP  1 \\\n  2>&1 |  tee    \\\n  $NETWORK1_DIR/z_log_dataset_net1.txt   \n\n\necho ========================\n\necho Phase 1 - B - Computing Mean of Network 1 Training Data - $NETWORK1_DIR\n$NETWORK1_DIR/$TRAIN_SRC-$TEST_SRC-create-mean-script.sh\n\nread -t 10\n\n\n\n\n\n\n\necho ========================\n\n\necho Phase 1 - C - Network 1 Training\n$NETWORK1_DIR/$TRAIN_SRC-$TEST_SRC-exe-script.sh\n\nread -t 10\n\n\n\n\n\n\n###############\necho ------------------------------------------------------\necho Phase 2 - A - Generating Data to be Fused - $NETWORK2_LEVELDB_FUSION_DIR\n\nmkdir -p $NETWORK2_LEVELDB_FUSION_DIR\n\n# Clean if some previous wrong data\n[ -d $NETWORK2_LEVELDB_FUSION_DIR/$TRAIN_SRC-leveldb ]  && [ ! -d $NETWORK2_LEVELDB_FUSION_DIR/$TEST_SRC-leveldb ] && rm -r $NETWORK2_LEVELDB_FUSION_DIR/$TRAIN_SRC-leveldb\n[ ! 
-d $NETWORK2_LEVELDB_FUSION_DIR/$TRAIN_SRC-leveldb ]  && [ -d $NETWORK2_LEVELDB_FUSION_DIR/$TEST_SRC-leveldb ] && rm -r $NETWORK2_LEVELDB_FUSION_DIR/$TEST_SRC-leveldb\n\n$GIT_PROJ_DIR/$EXE_P2_FUSE   \\\n  $DATASET_VIDEOS   \\\n  $DATASET_CONFIG   \\\n  $NETWORK2_LEVELDB_FUSION_DIR       $WINDOW_NETWORK2    $STEP  0  \\\n  2>&1 |  tee    \\\n  $NETWORK2_LEVELDB_FUSION_DIR/z_log_dataset_fuse.txt   \n\necho ========================\n\n\n\necho Phase 2 - B - Creating Mean File of Fused Data\necho \"Computing image mean for dataset: \" $TRAIN_SRC\n\n$CAFFE/build/tools/compute_image_mean -backend=leveldb  $NETWORK2_LEVELDB_FUSION_DIR/$TRAIN_SRC-leveldb $NETWORK2_LEVELDB_FUSION_DIR/mean.binaryproto\n\n\nread -t 10\n\n\n\n\n\n\n\n\n\n###############\necho ------------------------------------------------------\n# Info: # iterations (e.g. 10863 = 17 * 639. 639 is the # of test cases. \n# Info: Inside the prototxt, a batch # equal to # of persons (e.g. 5). 17 = 2 * 8 +1. 8 is the right temporal width\n\necho Phase 3 - Generarting LSTM 2 Data - $NETWORK2_DIR\n\n# Clean if some previous wrong data\n[ -d $NETWORK2_DIR/$TRAIN_SRC-leveldb ]  && [ ! -d $NETWORK2_DIR/$TEST_SRC-leveldb ] && rm -r $NETWORK2_DIR/$TRAIN_SRC-leveldb\n[ ! 
-d $NETWORK2_DIR/$TRAIN_SRC-leveldb ]  && [ -d $NETWORK2_DIR/$TEST_SRC-leveldb ] && rm -r $NETWORK2_DIR/$TEST_SRC-leveldb\n\n$GIT_PROJ_DIR/$EXE_P4_NETWORK2     \\\n   $FUSION_STYLE    $WINDOW_NETWORK2      GPU $GPU_ID     \\\n   $NETWORK1_MODEL_PATH   \\\n   $NETWORK2_EXTRACTION_NETOWRK_DIR/$TRAIN_SRC.prototxt   \\\n   $FUSION_LAYERS     \\\n   $NETWORK2_DIR/$TRAIN_SRC-leveldb   \\\n   $FUSION_TRAIN_ITER   \\\n   $FUSION_STYLE    $WINDOW_NETWORK2      GPU $GPU_ID    \\\n   $NETWORK1_MODEL_PATH    \\\n   $NETWORK2_EXTRACTION_NETOWRK_DIR/$TEST_SRC.prototxt    \\\n   $FUSION_LAYERS         \\\n   $NETWORK2_DIR/$TEST_SRC-leveldb    \\\n   $FUSION_TEST_ITER   \\\n   2>&1 |  tee    \\\n   $NETWORK2_DIR/z_log_dataset_net2.txt\n\nread -t 10\n\n\n\n\n\n\n\n\n\n\n\n###############\necho ------------------------------------------------------\n\necho Phase 4 - A - LSTM 2 Training - $NETWORK2_DIR\n\n$NETWORK2_DIR/$TRAIN_SRC-$TEST_SRC-exe-script.sh\n\n\n\n\n\necho ========================\n\n\necho Phase 4 - B - Temporal Evaluation\n$NETWORK2_DIR/$TRAIN_SRC-$TEST_SRC-window-evaluation-exe-script.sh\n\n\n\n\n\n\n\n\n\n###############\necho ------------------------------------------------------\necho \necho DONE processing script \"$0\"\necho \necho ------------------------------------------------------\n\n"
  },
  {
    "path": "src/custom-abbreviation.h",
    "content": "/*\n * custom-abbreviation.h\n *\n *  Created on: 2015-06-08\n *      Author: Moustafa S. Ibrahim\n */\n\n#ifndef CUSTOM_ABBREVIATION_H_\n#define CUSTOM_ABBREVIATION_H_\n\n#include <cmath>\n#include <string>\n#include <vector>\nusing std::string;\nusing std::vector;\n\nnamespace MostCV\n{\n  typedef vector<int>       vi;\n  typedef vector<double>    vd;\n  typedef vector< vi >      vvi;\n  typedef vector< vd >      vvd;\n  typedef vector<string>    vs;\n  typedef long long         ll;\n  typedef long double       ld;\n  //typedef unsigned char   uchar;\n\n  const ll      OO = (ll)1e10;\n  const double    PI  = std::acos(-1.0);\n  const long double   EPS = (1e-15);\n\n  // 4 orthogonal directions, 4 diagonal directions and last is same position\n  //int DR11[9] = {1, 0, 0, -1, 1, 1, -1, -1, 0};\n  //int DC11[9] = {0, 1, -1, 0, -1, 1, -1, 1, 0};\n\n  enum DIRS_ENUM {Left, Right, Bottpm, Top};\n}\n\n\n#endif /* CUSTOM_ABBREVIATION_H_ */\n"
  },
  {
    "path": "src/custom-images-macros.h",
    "content": "/*\n * custom-images-macros.h\n *\n *  Created on: 2015-05-21\n *      Author: Moustafa S. Ibrahim\n */\n\n#ifndef CUSTOM_IMAGES_MACROS_H_\n#define CUSTOM_IMAGES_MACROS_H_\n\nnamespace MostCV {\n\n#define REPIMG2(y, x, img)       for(int y=0;y<(int)(img.rows);++y) for(int x=0;x<(int)(img.cols);++x)\n#define REPIMG3(y, x, c, img)    for(int y=0;y<(int)(img.rows);++y) for(int x=0;x<(int)(img.cols);++x) for(int c=0;c<(int)(img.channels());++c)\n#define REPIMG_JUMP(y, x, dy, dx, img)       for(int y=0;y<(int)(img.rows);y+=dy) for(int x=0;x<(int)(img.cols);x+=dx)\n}\n\n#endif /* CUSTOM_IMAGES_MACROS_H_ */\n"
  },
  {
    "path": "src/custom-macros.h",
    "content": "/*\n * custom-macros.h\n *\n *  Created on: 2015-05-21\n *      Author: Moustafa S. Ibrahim\n */\n\n#ifndef CUSTOM_MACROS_H_\n#define CUSTOM_MACROS_H_\n\n#include <cstring>    // memset, used by CLR\n\nnamespace MostCV {\n\n#define ALL(v)        ((v).begin()), ((v).end())\n#define RALL(v)       ((v).rbegin()), ((v).rend())\n#define SZ(v)         ((int)((v).size()))\n#define CLR(v, d)     memset(v, d, sizeof(v))\n#define REP(i, v)     for(int i=0;i<SZ(v);++i)\n#define REPI(i, j, v)     for(int i=(j);i<SZ(v);++i)\n//#define REPIT(i, c) for(typeof((c).begin()) i = (c).begin(); i != (c).end(); i++)\n#define REPIT(i, c) for(auto i = (c).begin(); i != (c).end(); i++)\n#define LP(i, n)      for(int i=0;i<(int)(n);++i)\n#define LPI(i, j, n)  for(int i=(j);i<(int)(n);++i)\n#define LPD(i, j, n)    for(int i=(j);i>=(int)(n);--i)\n// Nested loop over a 2D container v: outer index i over rows, inner index j over v[i]\n#define REPA(v)       LPI(i, 0, SZ(v)) LPI(j, 0, SZ(v[i]))\n\n// ToDo: http://www.quora.com/What-are-some-macros-that-are-used-in-programming-contests\n}\n\n#endif /* CUSTOM_MACROS_H_ */\n"
  },
  {
    "path": "src/dlib-tracker-wrapper.cpp",
    "content": "/*\n * dlib-tracker-wrapper.cpp\n *\n *  Created on: 2015-06-22\n *      Author: Moustafa S. Ibrahim\n */\n\n#include \"dlib-tracker-wrapper.h\"\n#include \"custom-images-macros.h\"\n\n#include <iostream>\nusing std::cerr;\n\nnamespace MostCV {\n\nDlibTrackerWrapper::DlibTrackerWrapper(Rect initial_location) {\n  initial_location_ = initial_location;\n  step_ = 0;\n}\n\nRect DlibTrackerWrapper::UpdateTracker(Mat img) {\n  Rect img_rect = Rect(0, 0, img.cols-1, img.rows-1);\n  cv::Mat gray_img;\n\n  if (CV_8U != img.type() || 1 != img.channels())\n    cv::cvtColor(img, gray_img, cv::COLOR_BGR2GRAY);\n  else\n    gray_img = img;\n\n  dlib::array2d<uchar> dlib_img(gray_img.rows, gray_img.cols);\n\n  REPIMG2(y, x, gray_img)\n      dlib_img[y][x] = gray_img.at<uchar> (y, x);\n\n  if (step_ == 0) {\n    initial_location_ &= img_rect;  // Fix first one in case\n\n    if(initial_location_.area() == 0)\n    {\n      cerr<<\"Dlib: Empty rectangle for tracking! Let's do workaround\\n\";\n\n      initial_location_ = Rect(0, 0, 1, 1);\n    }\n\n    tracker_.start_track(dlib_img, dlib::centered_rect(dlib::point(initial_location_.x + initial_location_.width / 2, initial_location_.y + initial_location_.height / 2),\n                                                 initial_location_.width, initial_location_.height));\n    ++step_;\n    return initial_location_;\n  }\n\n  tracker_.update(dlib_img);\n  int y1 = tracker_.get_position().top();\n  int x1 = tracker_.get_position().left();\n  int y2 = tracker_.get_position().bottom();\n  int x2 = tracker_.get_position().right();\n\n  ++step_;\n\n  Rect rect = Rect(x1, y1, x2-x1, y2-y1);\n\n\n  rect &= img_rect;\n\n  if(rect.area() < 1)   // zero areas usually cause problems. Let's give them 1 area box\n    rect = Rect(0, 0, 1, 1);\n\n  return rect;\n}\n\n// back like: 0 -1 -2 -3  and forward 0 1 2 3 4 5 6. 
Helps when tracker centered on frame\npair<vector<Mat>, vector<Rect> > DlibTrackerWrapper::Process(vector<Mat> backwardImgs, vector<Mat> forwardImgs)\n{\n  vector<Rect> ret;\n\n  DlibTrackerWrapper backTracker(initial_location_);\n\n  for(auto img: backwardImgs)\n    ret.push_back( backTracker.UpdateTracker(img) );\n\n  if(forwardImgs.size() > 0)\n  {\n    std::reverse(ret.begin(), ret.end());\n    std::reverse(backwardImgs.begin(), backwardImgs.end());\n    backwardImgs.pop_back();\n    ret.pop_back(); // remove the middle, it will be added again. This is initial_location_\n  }\n\n  for(auto img: forwardImgs)\n  {\n    ret.push_back( UpdateTracker(img) );\n    backwardImgs.push_back(img);\n  }\n\n  return std::make_pair(backwardImgs, ret);\n}\n\n}\n"
  },
  {
    "path": "src/dlib-tracker-wrapper.h",
    "content": "/*\n * dlib-tracker-wrapper.h\n *\n *  Created on: 2015-06-22\n *      Author: Moustafa S. Ibrahim\n */\n\n#ifndef DLIB_TRACKER_WRAPPER_H_\n#define DLIB_TRACKER_WRAPPER_H_\n\n#include \"opencv2/core/core.hpp\"\n#include \"opencv2/highgui/highgui.hpp\"\n#include \"opencv2/imgproc/imgproc.hpp\"\nusing cv::Mat;\nusing cv::Ptr;\nusing cv::Scalar;\nusing cv::Rect;\nusing cv::Point;\nusing cv::Size;\n\n#include <dlib/image_processing.h>\n#include <dlib/gui_widgets.h>\n#include <dlib/image_io.h>\n#include <dlib/dir_nav.h>\n\n#include <vector>\nusing std::vector;\nusing std::pair;\n\nnamespace MostCV {\n\nclass DlibTrackerWrapper {\npublic:\n  DlibTrackerWrapper(Rect initial_location);\n\n  Rect UpdateTracker(Mat img);\n  pair<vector<Mat>, vector<Rect> > Process(vector<Mat> backwardImgs, vector<Mat> forwardImgs);\n\nprivate:\n  dlib::correlation_tracker tracker_;\n  Rect initial_location_;\n  int step_;\n};\n\n\n}\n\n#endif /* DLIB_TRACKER_WRAPPER_H_ */\n"
  },
  {
    "path": "src/images-utilities.cpp",
    "content": "#include \"images-utilities.h\"\n\n#include <iostream>\nusing std::cout;\n\n#include \"opencv2/core/core.hpp\"\n#include \"opencv2/highgui/highgui.hpp\"\n#include \"opencv2/imgproc/imgproc.hpp\"\n\n#include \"custom-images-macros.h\"\n#include \"custom-macros.h\"\n\nnamespace MostCV {\n\nvoid ShowImage(Mat image, int wait, bool bShow, string stringWindowName) {\n  if (bShow) {\n    cv::namedWindow(stringWindowName.c_str(), 1);\n    cv::imshow(stringWindowName.c_str(), image);\n    cv::waitKey(wait);\n  }\n}\n\nvoid RemoveImagePixels(Mat img, Mat mask, bool is_mask_remove_pixel_black, Point shift) {\n  REPIMG2(y, x, mask)\n    {\n      if (mask.at<uchar> (y, x) == 0 && !is_mask_remove_pixel_black)\n        continue;\n\n      if (mask.at<uchar> (y, x) > 0 && is_mask_remove_pixel_black)\n        continue;\n\n      if (img.channels() == 3) {\n        for (int c = 0; c < 3; ++c)\n          img.at<cv::Vec3b> (y + shift.y, x + shift.x)[c] = 0;\n      } else\n        img.at<uchar> (y + shift.y, x + shift.x) = 0;\n    }\n}\n\nvoid FixMask(Mat mask, int threshold) {\n  int cnt = 0;\n\n  REPIMG2(y, x, mask)\n    {\n      if (mask.at<uchar> (y, x) >= threshold) {\n        if (mask.at<uchar> (y, x) != 255)\n          cnt++;\n        mask.at<uchar> (y, x) = 255;\n      } else {\n        if (mask.at<uchar> (y, x) != 0)\n          cnt++;\n        mask.at<uchar> (y, x) = 0;\n      }\n    }\n  //if(cnt)    cout<<\"FixMask: \"<<cnt<<\" pixels\\n\";\n}\n\nvoid Morphology(Mat mask, bool do_open, bool do_close, int open_kernel_sz, int close_kernel_sz) {\n\n  Mat open_element = cv::getStructuringElement(0, Size(open_kernel_sz, open_kernel_sz));\n  Mat close_element = cv::getStructuringElement(0, Size(close_kernel_sz, close_kernel_sz));\n\n  if (do_open)\n    cv::morphologyEx(mask, mask, cv::MORPH_OPEN, open_element);\n\n  if (do_close)\n    cv::morphologyEx(mask, mask, cv::MORPH_CLOSE, close_element);\n}\n\nbool AddButton(Mat controlsMat, string buttonName, vector<Rect> 
&rectsSoFar, Scalar color) {\n  int lastY = 0;\n  int lastX = 0;\n  Rect imgRect = Rect(0, 0, controlsMat.cols - 1, controlsMat.rows - 1);\n\n  if (rectsSoFar.size()) {\n    Rect r = rectsSoFar.back();\n    lastY = r.y + r.height + 5;\n    lastX = r.x;\n  }\n  Rect r(lastX, lastY, 100, 30);\n\n  if ((r & imgRect) != r) {\n    lastY = 0;\n    lastX = r.x + r.width + 5;\n    r = Rect(lastX, lastY, 100, 30);\n\n    if ((r & imgRect) != r)\n      return false;\n  }\n\n  cv::rectangle(controlsMat, r, Scalar(255, 255, 255), 2);\n  cv::putText(controlsMat, buttonName, Point(r.x + 2, r.y + r.height / 2), cv::FONT_HERSHEY_SIMPLEX, 0.5, color);\n\n  rectsSoFar.push_back(r);\n\n  return true;\n}\n\nvector<Ptr<CComponenets> > GetConnectedComponenets(Mat img, int area_threshold, int pixels_threshold, Scalar lo_diff, Scalar up_diff, int flags) {\n\n  assert(area_threshold > 0 && pixels_threshold > 0);\n\n  Mat uchar_img;\n  Rect img_rect(0, 0, img.cols - 1, img.rows - 1);\n  vector<Ptr<CComponenets> > componenets;\n\n  if (img.channels() > 1)\n    cvtColor(img, uchar_img, CV_BGR2GRAY);\n  else\n    img.copyTo(uchar_img);\n\n  REPIMG2(y, x, uchar_img)\n    {\n      int pixel_value = (int) uchar_img.at<uchar> (y, x);\n\n      if (pixel_value < 1)\n        continue;\n\n      Rect rect;\n      Mat mask = Mat::zeros(uchar_img.rows + 2, uchar_img.cols + 2, CV_8UC1);\n\n      int mask_pixels_cnt = floodFill(uchar_img, mask, Point(x, y), Scalar(0), &rect, lo_diff, up_diff, flags);\n\n      rect &= img_rect;\n\n      if (rect.area() >= area_threshold && mask_pixels_cnt >= pixels_threshold) {\n        Ptr<CComponenets> component = new CComponenets();\n\n        MostCV::FixMask(mask);\n\n        componenets.push_back(component);\n        component->mask = mask(Rect(1, 1, uchar_img.cols, uchar_img.rows));\n        component->mask_pixels_cnt = mask_pixels_cnt;\n        component->rect = rect;\n        component->flood_starting_point = Point(x, y);\n        
component->parent_mask_topleft_point = Point(0, 0);\n      }\n    }\n  return componenets;\n}\n\nRect GetInternalBlobRect(Mat mask)\n{\n  assert(mask.type() == CV_8UC1);\n\n  vector<Ptr<MostCV::CComponenets> > comps = MostCV::GetConnectedComponenets(mask);\n\n  if(comps.size() == 0)\n    return Rect(0, 0, 1, 1);\n\n  Rect union_rect = comps[0]->rect;\n\n  REP(i, comps)\n    union_rect |= comps[i]->rect;\n\n  return union_rect;\n}\n\nvector<Point> GetCombinedContour(Mat mask) {\n  vector<vector<Point> > contours;\n  vector<cv::Vec4i> hierarchy;\n  Mat componentCpy;\n\n  mask.copyTo(componentCpy);\n  cv::findContours(componentCpy, contours, hierarchy, CV_RETR_EXTERNAL, CV_CHAIN_APPROX_SIMPLE);\n\n  vector<Point> contoursInOne;\n\n  REP(j, contours)\n    contoursInOne.insert(contoursInOne.end(), contours[j].begin(), contours[j].end());\n\n  return contoursInOne;\n}\n\nRect GetRect(Mat img)\n{\n  return Rect(0, 0, img.cols-1, img.rows-1);\n}\n\nvoid CenterRect(Rect &target_rect, int width, int height)\n{\n  if(width > target_rect.width)\n  {\n    target_rect.x -= (width - target_rect.width)/2;\n    target_rect.width = width;\n  }\n\n  if(height > target_rect.height)\n  {\n    target_rect.y -= (height - target_rect.height)/2;\n    target_rect.height = height;\n  }\n}\n\nbool CmpRectTopLeft(const Rect &a, const Rect &b)\n{\n  if(a.y != b.y)\n    return a.y < b.y;\n  return a.x < b.x;\n}\n\nvoid SaveVideo(vector<Mat> images, string path, int fps)\n{\n  if(images.empty())\n  {\n    std::cerr<<\"ERROR: Empty video\\n\";\n    return;\n  }\n\n  cv::VideoWriter videoObject;\n\n  videoObject.open(path, CV_FOURCC('X','V','I','D'), fps, Size(images[0].cols, images[0].rows), true);\n\n  if(!videoObject.isOpened())\n  {\n    std::cerr<<\"ERROR: Problem in out video path: \"<<path<<\"\\n\";\n    assert(false);\n  }\n\n  for(auto img : images)\n    videoObject<<img;\n}\n\n\n\n\n\n\n\n\n}\n"
  },
  {
    "path": "src/images-utilities.h",
    "content": "/*\n * ImagesHelper.h\n *\n *  Created on: 2015-03-01\n *      Author: mostafa\n */\n\n#ifndef IMAGESHELPER_H_\n#define IMAGESHELPER_H_\n\n#include<string>\n#include<vector>\nusing std::vector;\nusing std::string;\n\n#include \"opencv2/core/core.hpp\"\nusing cv::Mat;\nusing cv::Ptr;\nusing cv::Point;\nusing cv::Rect;\nusing cv::Scalar;\nusing cv::Size;\n\n#include \"custom-images-macros.h\"\n\nnamespace MostCV {\n\nstruct CComponenets {\n  Mat mask;\n  int mask_pixels_cnt;\n  Rect rect;\n  Point flood_starting_point;\n  Point parent_mask_topleft_point;\n};\n\nvoid ShowImage(Mat image, int wait = 0, bool bShow = true, string stringWindowName = \"Image\");\n\nvoid RemoveImagePixels(Mat img, Mat mask, bool is_mask_remove_pixel_black = false, Point shift = Point(0, 0));\n\nvoid FixMask(Mat mask, int threshold = 10);\n\nvoid Morphology(Mat mask, bool do_open = true, bool do_close = true, int open_kernel_sz = 3, int close_kernel_sz = 15);\n\nvector<Ptr<CComponenets> > GetConnectedComponenets(Mat img, int area_threshold = 1, int pixels_threshold = 1, Scalar lo_diff = Scalar(1), Scalar up_diff =\n    Scalar(1), int flags = 4 + (255 << 8));\n\nRect GetRect(Mat img);\n\nRect GetInternalBlobRect(Mat mask);\n\nvoid CenterRect(Rect &target_rect, int width, int height);\n\nvector<Point> GetCombinedContour(Mat mask);\n\nbool AddButton(Mat controlsMat, string buttonName, vector<Rect> &rectsSoFar, Scalar color = Scalar(255, 0, 0));\n\nbool CmpRectTopLeft(const Rect &a, const Rect &b);\n\nvoid SaveVideo(vector<Mat> images, string path, int fps = 25);\n\n////////////////////////////\n\ntemplate<class Type>  Mat ToRowMat(const vector<Type> &row)\n{\n  if(row.size() == 0)\n    return Mat(0, 0, cv::DataType<Type>::type);\n\n  const Type *ptr = &row[0];\n  Mat mat = Mat(1, row.size(), cv::DataType<Type>::type);\n\n  memcpy(mat.data, ptr, row.size()*sizeof(Type));\n\n  //Mat tempMat = Mat(featureVec).t();\n\n  return mat;\n}\n\ntemplate<class Type>  Mat ToColMat(const 
vector<Type> &col)\n{\n  if(col.size() == 0)\n    return Mat(0, 0, cv::DataType<Type>::type);\n\n  const Type *ptr = &col[0];\n  Mat mat = Mat(col.size(), 1, cv::DataType<Type>::type);\n\n  memcpy(mat.data, ptr, col.size()*sizeof(Type));\n\n  return mat;\n}\n\ntemplate<class Type>  Mat To2DMat(const vector<vector<Type>> & vectors)\n{\n  Mat mat;\n\n  for(auto row : vectors)\n    mat.push_back(ToRowMat(row));\n\n  return mat;\n}\n\n/*\ntemplate<typename function> void perform(function operation, Mat mat) {\n  if(mat.channels() == 2)\n  {\n    REPIMG2(y, x, mat)\n        mat.at<>\n  }\n  else {\n\n  }\n}\n*/\n\n}\n\n#endif /* IMAGESHELPER_H_ */\n"
  },
  {
    "path": "src/leveldb-reader.cpp",
    "content": "/*\n * leveldb-reader.cpp\n *\n *  Created on: 2015-05-21\n *      Author: Moustafa S. Ibrahim\n */\n\n#include<algorithm>\n#include<iostream>\n#include<fstream>\n\n#include \"leveldb-reader.h\"\nusing std::ifstream;\nusing std::ofstream;\nusing std::endl;\nusing std::cout;\n\n#include \"utilities.h\"\n\nMostCV::LevelDBReader::LevelDBReader(const string & database_path, const string & sorted_list_file) {\n  record_idx_ = 0;\n  cache_limit_ = 1000;\n  database_path_ = database_path;\n\n  is_caching = true;\n\n  if (sorted_list_file == \"\")\n    is_caching = false;\n\n  if (is_caching) {\n    ifstream ifs(sorted_list_file.c_str());\n    string line;\n\n    assert(ifs.is_open());\n\n    while (getline(ifs, line)) {\n      int pos = line.find(' ');\n\n      if (pos != -1)\n        line = line.substr(0, pos);\n\n      pos = line.find_last_of('/');\n\n      if (pos != -1)\n        line = line.substr(pos + 1);\n\n      if (line != \"\")\n        vectors_names_.push_back(line);\n    }\n    vector<string> images_names_temp = vectors_names_;\n    std::sort(images_names_temp.begin(), images_names_temp.end());\n\n    assert(images_names_temp == vectors_names_);\n  }\n\n  leveldb::Options options;\n  options.create_if_missing = true;\n  leveldb::Status status = leveldb::DB::Open(options, database_path_, &database_);\n  assert(status.ok());\n\n  database_iter_ = database_->NewIterator(leveldb::ReadOptions());\n  assert(database_iter_ != NULL);\n\n  database_iter_->SeekToFirst();\n}\n\nMostCV::LevelDBReader::~LevelDBReader() {\n  if (database_iter_ != NULL)\n    delete database_iter_;\n\n  if (database_ != NULL)\n    delete database_;\n}\n\nbool MostCV::LevelDBReader::GetNextEntry(string &key, vector<double> &retVec, int &label) {\n  if (!database_iter_->Valid())\n    return false;\n\n  Datum datum;\n  datum.clear_float_data();\n  datum.clear_data();\n  datum.ParseFromString(database_iter_->value().ToString());\n\n  key = database_iter_->key().ToString();\n  
label = datum.label();\n\n  int expected_data_size = std::max<int>(datum.data().size(), datum.float_data_size());\n  const int datum_volume_size = datum.channels() * datum.height() * datum.width();\n  if (expected_data_size != datum_volume_size) {\n    cout << \"Stored datum size does not match channels * height * width.\\n\";\n    assert(false);\n  }\n\n  retVec.resize(datum_volume_size);\n\n  const string& data = datum.data();\n  if (data.size() != 0) {\n    // Data stored in string, e.g. just pixel values of 196608 = 256 * 256 * 3\n    for (int i = 0; i < datum_volume_size; ++i)\n      retVec[i] = static_cast<uint8_t>(data[i]);  // bytes are unsigned pixels; a plain char read turns values 128..255 negative\n  } else {\n    // Data stored in real feature vector such as 4096 from feature extraction\n    for (int i = 0; i < datum_volume_size; ++i)\n      retVec[i] = datum.float_data(i);\n  }\n\n  database_iter_->Next();\n  ++record_idx_;\n\n  return true;\n}\n\nbool MostCV::LevelDBReader::GetNextEntryByKey(const string & name, vector<double> &retVec, int &label) {\n\n  if (!is_caching) {\n    cout << \"A sorted file MUST be given. 
What are you trying to retrieve?\\n\";\n    assert(false);\n  }\n\n  if (cache_.count(name)) {\n    retVec = cache_[name];\n    // NOTE: labels are not cached, so label is left unchanged on a cache hit.\n    return true;\n  }\n\n  string key;\n  while (GetNextEntry(key, retVec, label)) {\n    if ((int) cache_items_.size() == cache_limit_) {\n      map<string, vector<double> >::iterator it = cache_.find(cache_items_.front());\n\n      assert(it != cache_.end());\n      cache_.erase(it);\n      cache_items_.pop_front();\n    }\n    cache_[vectors_names_[record_idx_ - 1]] = retVec;\n    cache_items_.push_back(vectors_names_[record_idx_ - 1]);\n\n    if (vectors_names_[record_idx_ - 1] == name)\n      return true;\n  }\n\n  cout << \"Reached end of data: Total Records: \" << record_idx_ << \"\\n\";\n  cout << \"Failed to find data for: \" << name << \" in database path: \" << database_path_ << \"\\n\";\n\n  assert(false);  // We failed to retrieve!\n\n  return false;\n}\n\nvoid MostCV::LevelDBReader::Dump(const string & file_path, int featureVectorLimit) {\n  record_idx_ = 0;\n  database_iter_->SeekToFirst();\n\n  ofstream ofs(file_path.c_str());\n\n  vector<double> retVec;\n  string key;\n  int label;\n\n  while (GetNextEntry(key, retVec, label)) {\n    ofs << \"key=\" << key << \", label=\" << label << \", features length=\" << retVec.size();\n\n    if (featureVectorLimit > 0) {\n      ofs << \", truncated\";\n      retVec.resize(featureVectorLimit);  // To avoid writing too much\n    }\n\n    ofs << \", feature vec= \";\n    for (size_t i = 0; i < retVec.size(); ++i)\n      ofs << retVec[i] << \" \";\n    ofs << \"\\n\";\n  }\n  ofs.close();\n\n  cout << \"\\nDump done: Total Records: \" << record_idx_ << \"\\n\";\n}\n\nvoid MostCV::LevelDBReader::DumpSmall(const string &file_path, int featureVectorLimit, bool make_random) {\n  record_idx_ = 0;\n  database_iter_->SeekToFirst();\n\n  ofstream ofs(file_path.c_str());\n\n  vector<double> retVec;\n\n  string key;\n  int label;\n\n  for (int cnt = 0; cnt < 500 && GetNextEntry(key, retVec, label); ++cnt) 
{\n    ofs << \"key=\" << key << \", label=\" << label << \", features length=\" << retVec.size();\n\n    if (make_random)\n      std::random_shuffle(retVec.begin(), retVec.end());\n\n    if (featureVectorLimit > 0) {\n      ofs << \", truncated\";\n      retVec.resize(featureVectorLimit);  // To avoid writing much\n    }\n\n    ofs << \", feature vec= \";\n    for (size_t i = 0; i < retVec.size(); ++i)\n      ofs << retVec[i] << \" \";\n    ofs << \"\\n\";\n  }\n  ofs.close();\n\n  cout << \"\\nDump done: Total Records: \" << record_idx_ << \"\\n\";\n}\n\nvoid MostCV::LevelDBReader::ReadLabels(vector<int> &labels, int max_rows) {\n  record_idx_ = 0;\n  database_iter_->SeekToFirst();\n\n  labels.clear();\n\n  string key;\n  int label;\n  vector<double> retVec;\n\n  for (int row = 0; GetNextEntry(key, retVec, label); ++row) {\n    if(max_rows != -1 && max_rows == row)\n      break;\n    labels.push_back(label);\n  }\n}\n\nint MostCV::LevelDBReader::GetRecordsCount() {\n  record_idx_ = 0;\n  database_iter_->SeekToFirst();\n\n  string key;\n  int label;\n  vector<double> retVec;\n\n  while (GetNextEntry(key, retVec, label))\n    ;\n\n  return record_idx_;\n}\n\nvoid MostCV::LevelDBReader::SeekToHead() {\n  record_idx_ = 0;\n  database_iter_->SeekToFirst();\n}\n\n\n\n"
  },
  {
    "path": "src/leveldb-reader.h",
"content": "/*\n * leveldb-reader.h\n *\n *  Created on: 2015-05-21\n *      Author: Moustafa S. Ibrahim\n */\n\n/*\n * This file handles reading leveldb databases. Each database holds a set of feature vectors of the same length.\n */\n\n#ifndef LEVELDB_READER_H_\n#define LEVELDB_READER_H_\n\n#include <stdio.h>\n\n#include <string>\n#include <vector>\n#include <deque>\n#include <cassert>\n#include <iostream>\n#include <fstream>\n#include <map>\nusing std::map;\nusing std::deque;\nusing std::vector;\nusing std::string;\nusing std::endl;\nusing std::cout;\n\n#include <google/protobuf/text_format.h>\n#include <glog/logging.h>\n\n#include <leveldb/db.h>\n#include <leveldb/write_batch.h>\n\n#include \"caffe/proto/caffe.pb.h\"\n#include \"caffe/blob.hpp\"\n#include \"caffe/common.hpp\"\n#include \"caffe/net.hpp\"\n#include \"caffe/util/db.hpp\"\n#include \"caffe/util/io.hpp\"\n#include \"caffe/vision_layers.hpp\"\n\nusing caffe::Blob;\nusing caffe::Caffe;\nusing caffe::Datum;\nusing caffe::Net;\n\nnamespace MostCV {\n\n/*\n * The class opens a leveldb directory holding a set of feature vectors (e.g. extracted by caffe's feature extraction tool).\n * Each feature vector has a name, given in order in a \"sorted\" file.\n * The user can either retrieve all feature vectors in order or filter them by name.\n *\n * The user is expected to stick to one kind of the GetNextEntry methods. Similarly, after using a Dump method, other methods shouldn't be used.\n * Reason behind such limitation: all the methods advance the same iterator. E.g., after dumping, there are no more rows to read.\n *\n * Usage Example:\n *\n * LevelDBReader reader(database_path, sorted_images_list_file);\n * vector<double> feature_vector;\n * string key;\n * int label;\n *\n * while (reader.GetNextEntry(key, feature_vector, label))\n *   doSomething(feature_vector);\n *\n */\nclass LevelDBReader {\npublic:\n  /*\n   * Open and prepare the database for reading. 
The database is allowed to have more rows than the file, such that the extra rows have no corresponding name.\n   *\n   * The file names should be sorted; this allows efficient retrieval (e.g. caching the most recently read rows). As a result, the leveldb keys should be sorted the same way.\n   *\n   * If no file is given, entries are simply retrieved sequentially from the DB. This is more suitable for dumping purposes.\n   */\n  LevelDBReader(const string & database_path, const string & sorted_list_file = \"\");\n  ~LevelDBReader();\n\n  // Read the next entry from the database. If no more rows, return false.\n  bool GetNextEntry(string &key, vector<double> &retVec, int &label);\n\n  // Given an entry name from the sorted_images_list_file, return the corresponding vector. Consecutive calls should be ordered by name.\n  //    If not, the name shouldn't be far behind the last retrieved element, so it can still be served from the cache. We cache the last X elements.\n  bool GetNextEntryByKey(const string & name, vector<double> &retVec, int &label);\n\n  // For debugging purposes, dump the database to a file. Truncate each vector after the first featureVectorLimit elements.\n  void Dump(const string &file_path, int featureVectorLimit = -1);\n  void DumpSmall(const string &file_path, int featureVectorLimit = -1, bool make_random = true);\n  void ReadLabels(vector<int> &labels, int max_rows = -1);\n  int GetRecordsCount();\n  void SeekToHead();\n\nprivate:\n  bool is_caching;\n  vector<string> vectors_names_;\n  string database_path_;\n\n  leveldb::DB* database_;\n  leveldb::Iterator* database_iter_;\n\n  // Caching Variables\n  map<string, vector<double> > cache_;\n  deque<string> cache_items_;\n  int cache_limit_;\n\n  // Current row index in retrieval\n  int record_idx_;\n};\n\n}\n\n#endif /* LEVELDB_READER_H_ */\n"
  },
  {
    "path": "src/leveldb-writer.cpp",
"content": "/*\n * LeveldbWriter.cpp\n *\n *  Created on: 2015-04-02\n *      Author: Moustafa S. Ibrahim\n */\n\n#include <iostream>\n\n#include \"leveldb-writer.h\"\nusing std::cerr;\nusing std::cout;\n\n#include \"utilities.h\"\n\nconst int WRITING_LIMIT = 1000;\n\nnamespace MostCV {\n\nLeveldbWriter::LeveldbWriter(string db_path_, int resize_height_, int resize_width_, int volumeSize, bool is_virtual_) {\n  max_label_cnt = -1;\n  db_path = db_path_;\n  resize_height = resize_height_;\n  resize_width = resize_width_;\n  volume_size = volumeSize;\n  is_virtual = is_virtual_;\n\n  cerr<<\"\\n\\nCreating a database at: \"<<db_path_<<\"\\n\";\n\n  if(is_virtual_)\n    cerr<<\"\\tUsing VIRTUAL MODE dataset\\n\\n\";\n\n  countId = 0;\n  lastCountId = 0;\n  internal_idx = 1;\n\n  if (resize_height > 0) {  // then something already defined for the shape\n    datum.set_channels(volume_size);\n    datum.set_height(resize_height);\n    datum.set_width(resize_width);\n\n    cerr<<\"\\t(H, W, C) = \"<<resize_height<<\" \"<<resize_width<<\" \"<<volume_size<<\"\\n\";\n  }\n  if(!is_virtual) {\n    // leveldb\n    leveldb::Options options;\n    options.error_if_exists = true;\n    options.create_if_missing = true;\n    options.write_buffer_size = 268435456;\t// 8 * 32 * 1024 * 1024\n\n    // Open db\n    LOG(INFO)<< \"Opening leveldb \" << db_path;\n    leveldb::Status status = leveldb::DB::Open(options, db_path, &db);\n    CHECK(status.ok()) << \"Failed to open leveldb \" << db_path << \". 
Does it already exist?\";\n    batch = new leveldb::WriteBatch();\n  }\n\n  is_closed = false;\n}\n\nLeveldbWriter::~LeveldbWriter() {\n  forceFinalize();\n}\n\nvoid LeveldbWriter::clearDatum() {\n  assert(!is_closed);\n\n  datum.clear_data();\n  datum.clear_float_data();\n}\n\nvoid LeveldbWriter::setLabelsRange(int max_label_cnt) {\n  assert(!is_closed);\n\n  this->max_label_cnt = max_label_cnt;\n}\n\nvoid LeveldbWriter::setDatumLabel(int id) {\n  assert(!is_closed);\n\n  assert(id >= 0);\n\n  if (max_label_cnt != -1 && id >= max_label_cnt) {\n    cerr << \"Wrong label! (received, allowed count) = \" << id << \" - \" << max_label_cnt << \"\\n\";\n    assert(false);\n  }\n\n  datum.set_label(id);\n  labels.insert(id);\n  labelsVec.push_back(id);\n}\n\nvoid LeveldbWriter::addDatumToBatch(string key) {\n  assert(!is_closed);\n\n  if (key != \"\" && keys.insert(key).second == false)\n    cerr << \"Warning: key duplication: \" << key << \"\\n\";\n\n  if(is_virtual)  return;\n\n  string value;\n  bool serialized = datum.SerializeToString(&value);\n  (void) serialized;  // serialization must run even in NDEBUG builds, so it lives outside assert\n  assert(serialized);\n\n  string prefix = MostCV::toIntStr(\"0000000\", internal_idx++) + \"@\";\n  batch->Put(prefix + key, value);\n\n  if (++countId % WRITING_LIMIT == 0)\n    writeBatch();\n\n  clearDatum();\n}\n\nvoid LeveldbWriter::addDatumToBatch(caffe::Datum &datum, string key, int label) {\n  assert(!is_closed);\n\n  if (keys.insert(key).second == false)\n    cerr << \"Warning: Key duplication: \" << key << \"\\n\";\n\n  assert(label >= 0);\n\n  string value;\n  datum.set_label(label);\n  labels.insert(label);\n  labelsVec.push_back(label);\n\n  if(is_virtual)  return;\n\n  bool serialized = datum.SerializeToString(&value);\n  (void) serialized;  // serialization must run even in NDEBUG builds, so it lives outside assert\n  assert(serialized);\n\n  string prefix = MostCV::toIntStr(\"0000000\", internal_idx++) + \"@\";\n  batch->Put(prefix + key, value);\n\n  if (++countId % WRITING_LIMIT == 0)\n    writeBatch();\n\n  clearDatum();\n}\n\nbool LeveldbWriter::addVectorDatum(const vector<double> &feature_vec) {\n  assert(!is_closed);\n\n  if(is_virtual)  return true;\n\n  
clearDatum();\n\n  if (resize_height <= 0) {  // use first vector to define the outline\n    datum.set_height(resize_height = feature_vec.size());\n    datum.set_channels(1);\n    datum.set_width(1);\n  } else\n    assert((int )feature_vec.size() == resize_height * resize_width * volume_size);\n\n  for (int p = 0; p < (int) feature_vec.size(); ++p)\n    datum.add_float_data(feature_vec[p]);\n\n  return true;\n}\n\nbool LeveldbWriter::addImageToDatum(Mat imgMat_origin, int num_channels) {\n  assert(!is_closed);\n\n  if(is_virtual)  return true;\n\n  assert(resize_width > 0 && resize_height > 0);\n  assert(imgMat_origin.channels() == num_channels);  // Weird to send it :D\n\n  Mat imgMat;\n  cv::resize(imgMat_origin, imgMat, Size(resize_width, resize_height));\n  // add to db: 256 * 256 * 3 = 196608\n\n  string* datum_string = datum.mutable_data();\n\n  if (num_channels == 3) {\n    for (int c = 0; c < num_channels; ++c) {\n      for (int h = 0; h < imgMat.rows; ++h) {\n        for (int w = 0; w < imgMat.cols; ++w) {\n          datum_string->push_back(static_cast<uint8_t>(imgMat.at<cv::Vec3b>(h, w)[c]));\n        }\n      }\n    }\n  } else {\n    for (int h = 0; h < imgMat.rows; ++h) {\n      for (int w = 0; w < imgMat.cols; ++w) {\n        datum_string->push_back(static_cast<uint8_t>(imgMat.at<uchar>(h, w)));\n      }\n    }\n  }\n\n  return true;\n}\n\nbool LeveldbWriter::addImageToDatum(const string& filename, int num_channels) {\n  assert(!is_closed);\n\n  if(is_virtual)  return true;\n\n  int cv_read_flag = (num_channels == 3 ? 
CV_LOAD_IMAGE_COLOR : CV_LOAD_IMAGE_GRAYSCALE);\n\n  Mat imgMat_origin = cv::imread(filename, cv_read_flag);\n\n  if (!imgMat_origin.data) {\n    LOG(ERROR)<< \"Could not open or find file \" << filename;\n    return false;\n  }\n  return addImageToDatum(imgMat_origin, num_channels);\n}\n\nvoid LeveldbWriter::writeBatch() {\n  if (is_closed)\n    return;\n\n  if(is_virtual)  return;\n\n  if (countId == lastCountId)  // nothing changed\n    return;\n\n  leveldb::Status status = db->Write(leveldb::WriteOptions(), batch);\n  CHECK(status.ok()) << \"Failed to write the batch. Count id:  \" << countId << \"\\n\";\n\n  delete batch;\n  batch = new leveldb::WriteBatch();\n\n  LOG(ERROR)<<db_path<<\": Processed \" << countId << \" files.\";\n  lastCountId = countId;\n}\n\nvoid LeveldbWriter::forceFinalize() {\n  if (is_closed)\n    return;\n\n  if(!is_virtual) {\n\n    // write the last batch\n    if (countId % WRITING_LIMIT != 0)\n      writeBatch();\n\n    if (batch != NULL)\n      delete batch;\n\n    if (db != NULL)\n      delete db;\n  }\n\n  if (labels.size() == 1)  // Zero case, means caller not interested in setting labels. Just dummy labels.\n    cerr << \"\\n\\n\\nThere is only ONE label in database. There should be a bug\\n\";\n\n  cerr<<\"\\nLabels Statistics for db \"<<db_path<<\"\\n\";\n\n  cerr<<\"Total Records \"<<labelsVec.size()<<\"\\n\";\n\n  cerr<<\"*********************************************************\\n\";\n\n  MostCV::getFrequencyMap(labelsVec, true);\n\n  cerr<<\"*********************************************************\\n\";\n\n  MostCV::getFrequencyMapPercent(labelsVec, true);\n\n  is_closed = true;\n}\n\n}\n"
  },
  {
    "path": "src/leveldb-writer.h",
    "content": "/*\n * LeveldbWriter.h\n *\n *  Created on: 2015-04-02\n *      Author: Moustafa S. Ibrahim\n */\n\n#ifndef LeveldbWriter_H_\n#define LeveldbWriter_H_\n\n#include <string>\n#include <set>\n#include <vector>\nusing std::vector;\nusing std::set;\nusing std::string;\n\n#include <glog/logging.h>\n#include <leveldb/db.h>\n#include <leveldb/write_batch.h>\n#include \"caffe/proto/caffe.pb.h\"\n\n#include \"opencv2/core/core.hpp\"\n#include \"opencv2/highgui/highgui.hpp\"\n#include \"opencv2/imgproc/imgproc.hpp\"\nusing cv::Mat;\nusing cv::Size;\n\n\nnamespace MostCV {\n\nclass LeveldbWriter {\npublic:\n  // Using zero parameters would mean not interested to add addImageToDatum functionality.\n  LeveldbWriter(string db_path, int resize_height = -1, int resize_width = 1, int volumeSize = 1, bool is_virtual = false);\n  ~LeveldbWriter();\n\n  void clearDatum();\n  void setDatumLabel(int id);\n  bool addImageToDatum(const string& filename, int num_channels);\n  bool addImageToDatum(Mat img, int num_channels);\n\n  bool addVectorDatum(const vector<double> &feature_vec);\n  void addDatumToBatch(string key = \"\");\n  void addDatumToBatch(caffe::Datum &datum, string key, int label);\n\n  void setLabelsRange(int max_label_cnt);\n  void forceFinalize();\n\nprivate:\n  void writeBatch();\n\n\n  leveldb::DB* db;\n  leveldb::WriteBatch* batch;\n  caffe::Datum datum;\n  int countId;\n  int lastCountId;\n\n  string db_path;\n  int resize_height;\n  int resize_width;\n  int volume_size;\n  int internal_idx;\n\n  set<int> labels;          //helps in verification.\n  vector<int> labelsVec;    // print purposes\n  int max_label_cnt;\n  set<string> keys;\n  bool is_closed;\n  bool is_virtual;\n};\n\n}\n\n#endif /* LeveldbWriter_H_ */\n"
  },
  {
    "path": "src/rect-helper.cpp",
    "content": "/*\n * RectHelper.cpp\n *\n *  Created on: 2015-07-06\n *      Author: Moustafa S. Ibrahim\n */\n\n#include \"rect-helper.h\"\n\n#include \"images-utilities.h\"\n#include \"utilities.h\"\n#include \"custom-macros.h\"\n\nnamespace MostCV {\n\nRectHelper::RectHelper(Rect rect, double score) {\n  r = rect;\n  conf_score = score;\n  color = Scalar(rand() % 256, rand() % 256, rand() % 256); // random color\n}\n\nvector<RectHelper> RectHelper::ToRectHelpers(const vector<Rect> &rectangles_vec) {\n  vector<RectHelper> ret;\n\n  for(auto rect : rectangles_vec)\n    ret.push_back(RectHelper(rect));\n\n  return ret;\n}\n\nvector<Rect> RectHelper::ToRects(const vector<RectHelper> &rectangles_vec)\n{\n  vector<Rect> ret;\n\n  for(auto rect : rectangles_vec)\n    ret.push_back(rect.r);\n\n  return ret;\n}\n\n//////////////////////////// Static Methods /////////////////////////////\n\nvoid RectHelper::DrawRects(Mat img, const vector<RectHelper> &rectangles_vec, bool is_make_copy, bool is_show, Scalar color) {\n  Mat imgTemp;\n\n  if (is_make_copy) {\n    img.copyTo(imgTemp);\n    img = imgTemp;\n  }\n\n  for (auto rect_helper : rectangles_vec)\n    cv::rectangle(img, rect_helper.r, (color[0] == -1) ? 
rect_helper.color : color, 2);\n\n  int maxArea = 600 * 800;\n  int dif = sqrt(img.rows * img.cols / maxArea);\n\n  if(dif > 1)\n  {\n    Size size(img.cols / dif, img.rows / dif);\n    Mat toImg;\n    cv::resize(img, toImg, size);\n    img = toImg;\n  }\n\n  MostCV::ShowImage(img, 0, is_show);\n}\n\nmap<string, vector<RectHelper> > RectHelper::LoadImagesRectangles(string path_x1_y1_w_h){\n  map<string, vector<RectHelper> > retMap;\n\n  ifstream ifs(path_x1_y1_w_h);\n\n  int cnt;\n  string image_name;\n\n  while(ifs>>image_name>>cnt)\n  {\n    vector<RectHelper> rectHelpers;\n\n    while(cnt--)\n    {\n      double x, y, w, h;\n      double score;\n      ifs>>x>>y>>w>>h>>score;\n\n      rectHelpers.push_back(RectHelper(Rect(x, y, w, h), score));\n    }\n    retMap[image_name] = rectHelpers;\n  }\n  ifs.close();\n\n  return retMap;\n}\n\nvoid RectHelper::WriteImagesRectangles(const map<string, vector<RectHelper> > &image_rect_helpers_Map, string path_x1_y1_w_h)\n{\n  ofstream ofs(path_x1_y1_w_h);\n\n  for (auto img_rects_pair : image_rect_helpers_Map)\n  {\n    ofs<<img_rects_pair.first<<\" \"<<img_rects_pair.second.size();\n\n    for (auto rectHelper: img_rects_pair.second)\n      ofs<<\" \"<<rectHelper.r.x<<\" \"<<rectHelper.r.y<<\" \"<<rectHelper.r.width<<\" \"<<rectHelper.r.height<<\" \"<<rectHelper.conf_score;\n    ofs<<\"\\n\";\n  }\n  ofs.close();\n}\n\nvoid RectHelper::FilterBelowConfidenceThreshold(vector<RectHelper> &rects, double conf_score_threshold)\n{\n  for (size_t i = 0; i < rects.size(); ++i) {\n    if(MostCV::dcmp(rects[i].conf_score, conf_score_threshold) < 0)\n    {\n      rects.erase(rects.begin() + i);\n      --i;\n    }\n  }\n}\n\n\nbool __CmpSortByConfidence(const RectHelper &a, const RectHelper& b)\n{\n  return MostCV::dcmp(a.conf_score, b.conf_score) < 0;\n}\n\nvoid RectHelper::SortByConfidence(vector<RectHelper> &rects)\n{\n  sort(RALL(rects), __CmpSortByConfidence);\n}\n\nbool __CmpSortByTopLeftPoint(const RectHelper &a, const RectHelper& 
b)\n{\n  int d = MostCV::dcmp(a.r.x, b.r.x);\n\n  if(d != 0)\n    return d < 0;\n  return MostCV::dcmp(a.r.y, b.r.y) < 0;\n}\n\nvoid RectHelper::SortByTopLeftPoint(vector<RectHelper> &rects)\n{\n  sort(RALL(rects), __CmpSortByTopLeftPoint);\n}\n\n}\n"
  },
  {
    "path": "src/rect-helper.h",
    "content": "/*\n * RectHelper.h\n *\n *  Created on: 2015-07-06\n *      Author: Moustafa S. Ibrahim\n */\n\n#ifndef RECTHELPER_H_\n#define RECTHELPER_H_\n\n#include <iostream>\n#include <fstream>\n#include <vector>\n#include <string>\n#include <map>\nusing std::vector;\nusing std::map;\nusing std::string;\nusing std::endl;\nusing std::cout;\nusing std::ifstream;\nusing std::ofstream;\n\n#include \"opencv2/core/core.hpp\"\n#include \"opencv2/highgui/highgui.hpp\"\n#include \"opencv2/imgproc/imgproc.hpp\"\nusing cv::Mat;\nusing cv::Scalar;\nusing cv::Rect;\nusing cv::Point;\nusing cv::Size;\n\nnamespace MostCV {\n\nclass RectHelper {\npublic:\n\n  RectHelper(Rect rect = Rect(0, 0, 0, 0), double score = -1);\n\n  static vector<RectHelper> ToRectHelpers(const vector<Rect> &rectangles_vec);\n  static vector<Rect> ToRects(const vector<RectHelper> &rectangles_vec);\n  static void DrawRects(Mat img, const vector<RectHelper> &rectangles_vec, bool is_make_copy = true, bool is_show = true, Scalar color = Scalar(-1, -1, -1));\n  static void SortByConfidence(vector<RectHelper> &rects);\n  static void SortByTopLeftPoint(vector<RectHelper> &rects);\n  static void FilterBelowConfidenceThreshold(vector<RectHelper> &rects, double conf_score_threshold);\n  static map<string, vector<RectHelper> > LoadImagesRectangles(string path_x1_y1_w_h);\n  static void WriteImagesRectangles(const map<string, vector<RectHelper> > &imageRectHelpersMap, string path_x1_y1_w_h);\n\n  Rect r;\n  double conf_score;\n  string category;  // E.g. Car bbox\n  int category_idx;\n  Scalar color;   // For drawing\n\n  Mat image;  // Image the rectangle belong to it\n  string image_name;\n  string image_path;\n  string image_parent_path;\n};\n\n}\n\n#endif /* RECTHELPER_H_ */\n"
  },
  {
    "path": "src/utilities.cpp",
    "content": "/*\n * Utilities.cpp\n *\n *  Created on: 2015-03-13\n *      Author: Moustafa S. Ibrahim\n */\n\n#include \"utilities.h\"\n\n#include <stdio.h>\n#include <stdlib.h>\n\n#include <cstring>\n#include <cmath>\nusing std::memcpy;\nusing std::fabs;\n\n#include <boost/filesystem.hpp>\n#include <boost/algorithm/string/predicate.hpp>\n#include <boost/random/mersenne_twister.hpp>\n#include <boost/random/uniform_int.hpp>\n#include <boost/random/variate_generator.hpp>\nnamespace bst_fs = boost::filesystem;\nusing namespace boost::filesystem;\n\n#include \"custom-abbreviation.h\"\n\nnamespace MostCV {\n\nint dcmp(double x, double y) {\n\treturn fabs(x - y) <= EPS ? 0 : x < y ? -1 : 1;\n}\n\nmap<string, int> BuildStringIdMap(set<string> classes) {\n\tmap<string, int> classId;\n\n\tREPIT(strIt, classes)\n\t{\n\t\tstring str = *strIt;\n\n\t\tif (classId.count(str) == 0) {\n\t\t\tint sz = classId.size();\n\t\t\tclassId[str] = sz;\n\t\t}\n\t}\n\n\treturn classId;\n}\n\nmap<string, int> BuildStringIdMap(vector<string> classesVec) {\n\n\tset<string> classes(classesVec.begin(), classesVec.end());\n\n\treturn BuildStringIdMap(classes);\n}\n\nint UpdateStringIdMap(map<string, int> &classId, string str) {\n\tif (classId.count(str) == 0) {\n\t\tint sz = classId.size();\n\t\tclassId[str] = sz;\n\t\treturn sz;\n\t}\n\treturn classId[str];\n}\n\ndouble round(double d, int precision) {\n\tostringstream oss;\n\toss.setf(std::ios::fixed);\n\toss.precision(precision);\n\toss << d;\n\n\tistringstream iss(oss.str());\n\tiss >> d;\n\treturn d;\n}\n\nvoid fixDir(string &dir) {\n\tif (SZ(dir) == 0)\n\t\treturn;\n\n\tif (dir[SZ(dir) - 1] != PATH_SEP)\n\t\tdir += PATH_SEP;\n}\n\nstring getFileName(string dir) {\n\tint idx = dir.find_last_of(PATH_SEP);\n\n\tif (idx == -1)\n\t\treturn dir;\n\n\treturn dir.substr(idx + 1);\n}\n\nbool fileExist(string szFilePath, bool print) {\n\tifstream fin(szFilePath.c_str());\n\n\tif (!fin) {\n\t\tif (print)\n\t\t\tprintf(\"fileExist: Failed to open 
file [%s]\\n\", szFilePath.c_str());\n\t\treturn false;\n\t}\n\tfin.close();\n\treturn true;\n}\n\nstring trim(string str) {\n\tint s = 0, e = SZ(str) - 1;\n\tREP(i, str)\n\t{\n\t\tif (!isspace(str[i]))\n\t\t\tbreak;\n\t\ts++;\n\t}\n\n\tLPD(i, SZ(str)-1, 0)\n\t{\n\t\tif (!isspace(str[i]))\n\t\t\tbreak;\n\t\te--;\n\t}\n\n\tif (s > e)\n\t\treturn \"\";\n\treturn str.substr(s, e - s + 1);\n}\n\nstring toLower(string str) {\n\tstring ret = \"\";\n\tREP(i, str)\n\t\tret += tolower(str[i]);\n\treturn ret;\n}\n\nstring toUpper(string str) {\n\tstring ret = \"\";\n\tREP(i, str)\n\t\tret += toupper(str[i]);\n\treturn ret;\n}\n\nbool startsWith(string str, string pat) {\n\treturn (int) str.find(pat) == 0;\n}\n\nint random(int range) {\n\treturn rand() % range;\n}\n\nchar* toCharArr(string str) {\n\tchar *s = new char[SZ(str) + 1];\n\ts[SZ(str)] = '\\0';\n\tmemcpy(s, str.c_str(), SZ(str));\n\treturn s;\n}\n\nstring toIntStr(string st, int add, bool append_zeros) {\n\tint val = toType(st, 1);\n\tval += add;\n\tstring ret = toString(val);\n\n\tif (append_zeros && ret.size() < st.size())\n\t\tret = string(st.size() - ret.size(), '0') + ret;  //pad zeros\n\treturn ret;\n}\n\nstring removeExt(string name) {\n\tint pos = name.find_last_of('.');\n\n\tif (pos != -1)\n\t\tname = name.substr(0, pos);\n\treturn name;\n}\n\nbool IsPathExist(string path) {\n\treturn boost::filesystem::exists(path);\n}\n\nint CountFileLines(string path)\n{\n\t std::ifstream inFile(path);\n\n\t if(inFile.fail())\n\t {\n\t\t cerr<<\"Couldn't open path: \"<<path<<\"\\n\";\n\n\t\t assert(false);\n\t }\n\n\t int ans = std::count(std::istreambuf_iterator<char>(inFile), std::istreambuf_iterator<char>(), '\\n');\n\n\t inFile.close();\n\n\t return ans;\n}\n\nvector<int> GetPerm(int length, int seed)\n{\n\tboost::mt19937 randGenerator(seed);\n\tboost::uniform_int<> uniform_int_dist;\n\tboost::variate_generator<boost::mt19937&, boost::uniform_int<> > rand_generator(randGenerator, uniform_int_dist);\n\n\tvector<int> 
perm(length);\n\n\tfor (int i = 0; i < (int) perm.size(); ++i)\n\t\tperm[i] = i;\n\n\t// Fisher-Yates shuffle driven by the seeded generator, so the permutation\n\t// is random yet reproducible for a given seed.\n\tfor (int i = (int) perm.size() - 1; i > 0; --i) {\n\t\tboost::uniform_int<> pick(0, i);\n\t\tstd::swap(perm[i], perm[pick(randGenerator)]);\n\t}\n\n\treturn perm;\n}\n\nstring consumeStringParam(int &argc, char** &argv, string variable_name) {\n\treturn consumeParam(argc, argv, string(\"\"), variable_name);\n}\n\nint consumeIntParam(int &argc, char** &argv, string variable_name) {\n\treturn consumeParam(argc, argv, 1, variable_name);\n}\n\ndouble consumeDoubleParam(int &argc, char** &argv, string variable_name) {\n\treturn consumeParam(argc, argv, 1.0, variable_name);\n}\n\nvector<string> GetDirs(string szRoot) {\n\tvector<string> ret;\n\n\tfor (bst_fs::directory_iterator itr(szRoot); itr != bst_fs::directory_iterator(); ++itr) {\n\n\t\tstring path_str = itr->path().c_str();\n\n\t\tif (bst_fs::is_directory((itr->status())))\n\t\t\tret.push_back(path_str);\n\t}\n\n\tsort(ret.begin(), ret.end());\n\treturn ret;\n}\n\nvector<string> GetDirsNames(string szRoot) {\n\tvector<string> ret;\n\n\tfor (bst_fs::directory_iterator itr(szRoot); itr != bst_fs::directory_iterator(); ++itr) {\n\n\t\tstring path_str = itr->path().c_str();\n\n\t\tif (bst_fs::is_directory((itr->status())))\n\t\t\tret.push_back(itr->path().filename().c_str());\n\t}\n\n\tsort(ret.begin(), ret.end());\n\treturn ret;\n}\n\nvector<string> GetFiles(string szRoot, string endwith) {\n\tvector<string> ret;\n\n\tfor (bst_fs::directory_iterator itr(szRoot); itr != bst_fs::directory_iterator(); ++itr) {\n\n\t\tstring path_str = itr->path().c_str();\n\n\t\tif (bst_fs::is_regular_file((itr->status())))\n\t\t{\n\t\t\tif(endwith == \"\" || boost::algorithm::ends_with(path_str, endwith))\n\t\t\t\tret.push_back(path_str);\n\t\t}\n\t}\n\n\tsort(ret.begin(), ret.end());\n\treturn ret;\n}\n\nvector<string> GetFilesExt(string szRoot, string endwith) {\n\tvector<string> ret;\n\n\tfor (bst_fs::directory_iterator itr(szRoot); itr != bst_fs::directory_iterator(); ++itr) {\n\t\tstring path_str = itr->path().c_str();\n\n\t\tif 
(bst_fs::is_regular_file((itr->status())))\n\t\t{\n\t\t\tif(endwith == \"\" || boost::algorithm::ends_with(path_str, endwith))\n\t\t\t\tret.push_back(itr->path().extension().c_str());\n\t\t}\n\t}\n\n\tsort(ret.begin(), ret.end());\n\treturn ret;\n}\n\nvector<string> GetFilesNames(string szRoot, string endwith) {\n\tvector<string> ret;\n\n\tfor (bst_fs::directory_iterator itr(szRoot); itr != bst_fs::directory_iterator(); ++itr) {\n\n\t\tstring path_str = itr->path().c_str();\n\n\t\tif (bst_fs::is_regular_file((itr->status())))\n\t\t{\n\t\t\tif(endwith == \"\" || boost::algorithm::ends_with(path_str, endwith))\n\t\t\t\tret.push_back(itr->path().filename().c_str());\n\t\t}\n\t}\n\n\tsort(ret.begin(), ret.end());\n\n\treturn ret;\n}\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n}\n\n"
  },
  {
    "path": "src/utilities.h",
    "content": "/*\n * general_utilities.h\n *\n *  Created on: 2015-03-11\n *      Author: Moustafa S. Ibrahim\n */\n\n#ifndef GENERAL_UTILITIES_H_\n#define GENERAL_UTILITIES_H_\n\n#include \"custom-macros.h\"\n\n#include <assert.h>\n#include<string>\n#include<vector>\n#include<map>\n#include<set>\n#include<iostream>\n#include<sstream>\n#include<fstream>\n\nusing std::string;\nusing std::ostringstream;\nusing std::istringstream;\nusing std::ifstream;\nusing std::set;\nusing std::map;\nusing std::vector;\nusing std::cout;\nusing std::cerr;\nusing std::pair;\n\nnamespace MostCV {\n\nconst char PATH_SEP = '/';\n\nint dcmp(double x, double y);\n\ndouble round(double d, int precision);\n\nvoid fixDir(string &dir);\n\nbool IsPathExist(string path);\n\nstring getFileName(string dir);\n\nbool fileExist(string szFilePath, bool print = true);\n\nstring trim(string str);\n\nstring toLower(string str);\n\nstring toUpper(string str);\n\nbool startsWith(string str, string pat);\n\nint random(int range);\n\nchar* toCharArr(string str);\n\nstring toIntStr(string st, int add, bool append_zeros = true);\n\nstring removeExt(string name);\n\nmap<string, int> BuildStringIdMap(set<string> classId);\n\nmap<string, int> BuildStringIdMap(vector<string> classesVec);\n\nint UpdateStringIdMap(map<string, int> &items_map, string str);\n\nint CountFileLines(string path);\n\nvector<int> GetPerm(int length, int seed = 123);\n\nstring consumeStringParam(int &argc, char** &argv, string variable_name = \"\");\nint consumeIntParam(int &argc, char** &argv, string variable_name = \"\");\ndouble consumeDoubleParam(int &argc, char** &argv, string variable_name = \"\");\n\nvector<string> GetDirs(string szRoot);\nvector<string> GetDirsNames(string szRoot);\nvector<string> GetFiles(string szRoot, string endwith = \"\");\nvector<string> GetFilesExt(string szRoot, string endwith = \"\");\nvector<string> GetFilesNames(string szRoot, string endwith = 
\"\");\n\n////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////\n\ntemplate<class Type> Type toType(string data, Type indicator, string variable_name = \"\") {\n  istringstream iss(data);\n\n  Type item;\n  iss >> item;\n\n  if(iss.fail())\n  {\n    if(variable_name != \"\")\n      cerr<<\"Problem in reading variable: \"<<variable_name<<\"\\n\";\n    cerr<<\"Failed to convert string: [\"<<data<<\"] to same type as variable [\"<<indicator<<\"]\\n\";\n    assert(false);\n  }\n\n  return item;\n}\n\ntemplate<class Type> Type consumeParam(int &argc, char** &argv, Type indicator, string variable_name = \"\")\n{\n  assert(argc > 0);\n  string ret = argv[0];\n  --argc, ++argv;\n  return toType(ret, indicator, variable_name);\n}\n\n\n\ntemplate<class Type> char* toCharPtr(Type val) {\n  ostringstream oss;\n  oss << val;\n  return toCharArr(oss.str());\n}\n\ntemplate<class Type> string toString(Type val) {\n  ostringstream oss;\n  oss << val;\n  return oss.str();\n}\n\ntemplate<class Type> vector<Type> readStringItems(string data, Type indicator) {\n  vector<Type> items;\n  Type item;\n\n  istringstream iss(data);\n\n  while (iss >> item)\n    items.push_back(item);\n\n  return items;\n}\n\ntemplate<class Type> vector<Type> readFileItems(string filePath, Type indicator, bool print = true) {\n  vector<Type> items;\n  Type item;\n\n  ifstream fin(filePath.c_str());\n  if (!fin) {\n    if (print)\n      printf(\"\\n\\tWARNING: readFileItems: Failed to open file [%s]\\n\", filePath.c_str());\n    fflush(stdout);\n    return items;\n  }\n\n  while (fin >> item)\n    items.push_back(item);\n\n  fin.close();\n  return items;\n}\n\ntemplate<class Type> vector<Type> readFileItems(ifstream &fin, Type indicator, int length = -1) {\n  Type item;\n  vector<Type> items;\n\n  if(length == -1)\n  {\n\t  while (fin >> item)\n\t      items.push_back(item);\n  }\n  else\n  {\n\t  items.resize(length);\n\n\t  for (int 
pos = 0; pos < items.size(); ++pos)\n\t  {\n\t\t  fin >> item;\n\n\t\t  assert(!fin.fail());\n\n\t\t  items[pos] = item;\n\t  }\n  }\n\n\n  return items;\n}\n\ntemplate<class Type> vector<Type> readStreamItems(istringstream &iss, Type indicator, int length = -1) {\n  Type item;\n  vector<Type> items;\n\n  if(length == -1)\n  {\n\t  while (iss >> item)\n\t      items.push_back(item);\n  }\n  else\n  {\n\t  items.resize(length);\n\n\t  for (int pos = 0; pos < items.size(); ++pos)\n\t  {\n\t\t  iss >> item;\n\n\t\t  assert(!iss.fail());\n\n\t\t  items[pos] = item;\n\t  }\n  }\n\n\n  return items;\n}\n\n\ntemplate<class Type> vector<vector<Type> > read2dFileItems(string filePath, Type indicator, bool print = true) {\n  vector<vector<Type> > items;\n\n  ifstream fin(filePath.c_str());\n\n  if (fin.fail()) {\n      printf(\"read2dFileItems: Failed to open file [%s]\\n\", filePath.c_str());\n      assert(false);\n  }\n\n  string line;\n  while (getline(fin, line))\n  {\n\t  if(line != \"\")\n\t\t  items.push_back(readStringItems(line, indicator));\n  }\n\n  return items;\n}\n\n\n\n// For every element that has max frequency, add its position. 
Total elements equal to # of unique elements\n// 2 3 2 2 2 2 4 4    => 0 6 1\ntemplate<class Type> vector<int> getMaxFrequentPositions(vector<Type> &vec) {\n  vector<int> retVec;\n  map<Type, vector<int> > freq_map;\n\n  for (int i = 0; i < (int) vec.size(); ++i)\n    freq_map[vec[i]].push_back(i);\n\n  set<pair<int, vector<int> >, std::greater<pair<int, vector<int> > > > freqs;\n\n  for (auto kv : freq_map)\n    freqs.insert(std::make_pair(kv.second.size(), kv.second));\n\n  for (auto group : freqs)\n    retVec.push_back(group.second[0]);\n\n  return retVec;\n}\n\ntemplate<class Type> Type getMaxFrequentLabel(vector<Type> &vec)\n{\n  assert(vec.size() > 0);\n\n  vector<int> pos = getMaxFrequentPositions(vec);\n\n  return vec[ pos[0] ];\n}\n\ntemplate<class Type> map<Type, int> getFrequencyMap(const vector<Type> &vec, bool print = false) {\n  map<Type, int> freq_map;\n\n  for (int i = 0; i < (int) vec.size(); ++i)\n    freq_map[vec[i]]++;\n\n  if (print) {\n    for (auto kv : freq_map)\n      cerr << \"Key = \"<<kv.first << \"\\t => Value \" << kv.second << \" instances\\n\";\n  }\n\n  return freq_map;\n}\n\ntemplate<class Type> map<Type, int> getFrequencyMapPercent(vector<Type> &vec, bool print = false) {\n  map<Type, int> freq_map;\n\n  for (int i = 0; i < (int) vec.size(); ++i)\n    freq_map[vec[i]]++;\n\n  if (print) {\n    cerr.precision(1);\n    cerr.setf(std::ios::fixed);\n\n    for (auto kv : freq_map)\n      cerr << \"Key = \"<<kv.first << \"\\t => Value \" << 100.0 * kv.second / (double)vec.size()<< \" %\\n\";\n  }\n\n  return freq_map;\n}\n\ntemplate<class Type1, class Type2> vector<Type2> castVector(const vector<Type1> &row, Type2 indicator) {\n  vector<Type2> ret;\n\n  ret.reserve(row.size());\n\n  for(auto val : row)\n    ret.push_back((Type2)val);\n\n  return ret;\n}\n\ntemplate<class Type1, class Type2> vector<vector<Type2>> cast2DVector(const vector<vector<Type1>> &matrix, Type2 indicator) {\n  vector<vector<Type2>> ret;\n\n  ret.reserve(matrix.size());\n\n  
for(auto row : matrix)\n    ret.push_back(castVector(row, indicator));\n\n  return ret;\n}\n\n\n}\n\n#endif /* GENERAL_UTILITIES_H_ */\n"
  },
  {
    "path": "src/volleyball-dataset-mgr.cpp",
    "content": "/*\n * volleyball-dataset-mgr.cpp\n *\n *  Created on: Nov 28, 2015\n *      Author: msibrahi\n */\n\n#include \"volleyball-dataset-mgr.h\"\n\n#include <boost/filesystem.hpp>\nnamespace bst_fs = boost::filesystem;\n\n#include \"utilities.h\"\n#include \"images-utilities.h\"\n\n#include <boost/random/mersenne_twister.hpp>\n#include <boost/random/uniform_int.hpp>\n#include <boost/random/variate_generator.hpp>\n\nnamespace MostCV {\n\nmap<string, int> global_video_id_frame_id_to_activityId;\nmap<string, vector<VolleyballPerson> > global_video_id_frame_id_to_persons;\n\nmap<string, int> persons_actions_ids_map;\nmap<string, int> scene_activities_ids_map;\n\n// statistics\nmap<string, int> scene_activities_freq_map;\nmap<string, int> players_activities_freq_map;\n\nVolleyballVideoData::VolleyballVideoData(string video_id, string video_dir) {\n  MostCV::fixDir(video_dir);\n\n  video_id_ = video_id;\n  video_dir_ = video_dir;\n\n  string annot_file = video_dir + \"annotations.txt\";\n  vector<vector<string> > data2dVec = MostCV::read2dFileItems(annot_file, string(\"\"), false);\n\n  // For every frame, read the players in it\n  for (auto frame_data : data2dVec) {\n    VolleyballPerson person;\n\n    string frame_id = frame_data[0];\n\n    GetFramePath(frame_id); // verify on hard disk\n\n    frame_data.erase(frame_data.begin());\n\n   // if (frame_data[0].find(\"win\") == string::npos)\n   //   continue;\n\n    scene_activities_freq_map[ frame_data[0] ]++;\n\n    int frame_activity_id = MostCV::UpdateStringIdMap(scene_activities_ids_map, frame_data[0]);\n    annot_frame_id_to_activity_id_map_[frame_id] = frame_activity_id;\n    frame_data.erase(frame_data.begin());\n\n    pair<int, int> min_max_persons_y = { 10000, 0 };\n\n    for (int k = 0; k < (int) frame_data.size(); k += 5) {\n      int x = MostCV::toType(frame_data[k + 0], 0);\n      int y = MostCV::toType(frame_data[k + 1], 0);\n      int w = MostCV::toType(frame_data[k + 2], 0);\n      int h = 
MostCV::toType(frame_data[k + 3], 0);\n      string activity_str = frame_data[k + 4];\n\n      players_activities_freq_map[activity_str]++;\n\n      min_max_persons_y.first = std::min(min_max_persons_y.first, y);\n      min_max_persons_y.second = std::max(min_max_persons_y.second, y + h);\n\n      person.bbox_ = RectHelper(Rect(x, y, w, h));\n      person.action_id_ = MostCV::UpdateStringIdMap(persons_actions_ids_map, activity_str);\n\n      annot_frame_id_persons_map_[frame_id].push_back(person);\n    }\n\n\n\n    if (min_max_persons_y.first < 0)\n      min_max_persons_y.first = 0;\n\n    annot_frame_id_to_min_max_persons_y_map_[frame_id] = min_max_persons_y;\n    annot_frame_id_vec_.push_back(frame_id);\n\n    string video_id_frame_id = video_id + \"#\"+frame_id;\n\n    global_video_id_frame_id_to_activityId[video_id_frame_id] = frame_activity_id;\n    global_video_id_frame_id_to_persons[video_id_frame_id] = annot_frame_id_persons_map_[frame_id];\n\n    if (annot_frame_id_persons_map_[frame_id].size() < 7)\n    {\n    \tcerr<<\"video \"<<video_id_frame_id<<\" frame id \"<<frame_id\n    \t    <<\" has \"<<annot_frame_id_persons_map_[frame_id].size()<<\" persons\\n\";\n    }\n  }\n\n  SortPersonsPerFrames();\n\n  cerr << video_id_ << \" is processed\\n\";\n}\n\nvoid VolleyballVideoData::SortPersonsPerFrames() {\n  // Sorting the persons based on top left point: x first, if tie, y first. 
Kind of left-to-right sweeping\n  for (auto &frame_persons_kv : annot_frame_id_persons_map_) {\n    vector<VolleyballPerson> &persons = frame_persons_kv.second;\n\n    sort(persons.begin(), persons.end(), [](const VolleyballPerson &a, const VolleyballPerson &b)\n    {\n      if(a.bbox_.r.x != b.bbox_.r.x)\n      return a.bbox_.r.x < b.bbox_.r.x;\n      return a.bbox_.r.y < b.bbox_.r.y;\n    });\n  }\n}\n\nvoid VolleyballVideoData::ResetPersons(string img_name, vector<RectHelper> rects) {\n  annot_frame_id_persons_map_[img_name].clear();\n\n  for (auto rect : rects) {\n    VolleyballPerson person;\n\n    person.bbox_ = rect;\n    person.action_id_ = 0;\n\n    annot_frame_id_persons_map_[img_name].push_back(person);\n  }\n}\n\nvector<RectHelper> VolleyballVideoData::GetPersonsRect(string frame_id) {\n  vector<RectHelper> rects;\n\n  for (auto person : annot_frame_id_persons_map_[frame_id])\n    rects.push_back(person.bbox_);\n\n  return rects;\n}\n\n// Short Util\nstring VolleyballVideoData::GetFramePath(string frame_id, int shift) {\n  string frame_id_no_ext = frame_id.substr(0, frame_id.find_first_of('.'));\n  string ext = frame_id.substr(frame_id.find_first_of('.'));\n  string target_frame_id = MostCV::toIntStr(frame_id_no_ext, shift, false);\n  string frame_new_path = video_dir_ + frame_id_no_ext + MostCV::PATH_SEP + target_frame_id + ext;\n\n  assert(boost::filesystem::exists(frame_new_path));\n\n  return frame_new_path;\n}\n\npair<vector<string>, vector<string> > VolleyballVideoData::GetTemporalWindowPaths(string frame_id, int temporal_window, int step, bool is_use_expend_factor) {\n  vector<string> window_frames_after;\n  vector<string> window_frames_before;\n\n  if (is_use_expend_factor)\n    temporal_window = 2 * temporal_window + 1;\n\n  LP(w, 1+temporal_window/2)\n  {\n    string path = GetFramePath(frame_id, -w * step);\n    window_frames_before.push_back(path);\n  }\n\n  LP(w, (temporal_window+1)/2)\n  {\n    string path = GetFramePath(frame_id, w * 
step);\n    window_frames_after.push_back(path);\n  }\n\n  return {window_frames_before, window_frames_after};\n}\n\nvector<string> VolleyballVideoData::GetTemporalWindowPathsMerged(string frame_id, int temporal_window, int step) {\n  vector<string> paths;\n\n  int start = -temporal_window/2;\n\n  LP(w, temporal_window)\n  {\n    string path = GetFramePath(frame_id, start * step);\n    paths.push_back(path);\n    ++start;\n  }\n  return paths;\n}\n\nvoid VolleyballVideoData::visualize()\n{\n  for (auto frame_id : annot_frame_id_vec_)\n  {\n    string path = GetFramePath(frame_id);\n    Mat img = cv::imread(path);\n\n    cerr<<video_id_<<\" \"<<frame_id<<\" \"<<path<<\"\\n\";\n    RectHelper::DrawRects(img, GetPersonsRect(frame_id));\n  }\n}\n\n//---------------------------------------------------------------\n\nVolleyballDatasetPart::VolleyballDatasetPart(string dataset_name, string config_file, string videos_root_dir) {\n\n  cerr << \"Preparing Dataset: \" << dataset_name << \"\\n\\tfrom config file: \" << config_file << \"\\n\";\n\n  assert(MostCV::IsPathExist(config_file));\n\n  MostCV::fixDir(videos_root_dir);\n  ids_ = MostCV::readFileItems(config_file, string(\"\"), false);\n  dataset_name_ = dataset_name;\n\n  for (auto video_seq : ids_)\n\t  videos_vec_.push_back(VolleyballVideoData(video_seq, videos_root_dir + video_seq));\n\n  cerr << \"\\n\\n************************\\n\\n\";\n}\n\nvoid VolleyballDatasetPart::ReorderVideos(vector<string> video_ids) {\n  for (int i = 0; i < (int) video_ids.size(); ++i) {\n    for (int j = 0; j < (int) ids_.size(); ++j) {\n      if (video_ids[i] != ids_[j])\n        continue;\n      std::swap(ids_[i], ids_[j]);\n      std::swap(videos_vec_[i], videos_vec_[j]);\n    }\n  }\n}\n\nvector<pair<VolleyballVideoData, int> > VolleyballDatasetPart::GetVideoFrameList(bool is_shuffled, int subset_percent) {\n  vector<pair<VolleyballVideoData, int> > database_shuffled;\n\n  boost::mt19937 
generator(100);\n  boost::uniform_int<> uni_dist;\n  boost::variate_generator<boost::mt19937&, boost::uniform_int<> > rand_generator(generator, uni_dist);\n\n  vector<int> labels;\n\n  for (auto video : videos_vec_) {\n    int frame_pos = -1;\n\n    for (auto frame_id : video.annot_frame_id_vec_) {\n      ++frame_pos;\n\n      database_shuffled.push_back(std::make_pair(video, frame_pos));\n    }\n  }\n\n  if (is_shuffled) {\n    cerr << \"Before: Total Shuffled Elements: \" << database_shuffled.size() << \" with 1st video \" << database_shuffled.begin()->first.video_id_ << \"\\n\";\n\n    std::random_shuffle(database_shuffled.begin(), database_shuffled.end(), rand_generator);\n\n    cerr << \"After: Total Shuffled Elements: \" << database_shuffled.size() << \" with 1st video \" << database_shuffled.begin()->first.video_id_ << \"\\n\";\n  }\n\n  // subset_percent is a percentage in [0, 100]\n  int max_size = subset_percent * (int) database_shuffled.size() / 100;\n  database_shuffled.resize(max_size);\n\n  return database_shuffled;\n}\n\nvoid VolleyballDatasetPart::visualize()\n{\n  for (auto video : videos_vec_)\n    video.visualize();\n}\n\n//---------------------------------------------------------------\n\nVolleyballDatasetMgr::VolleyballDatasetMgr(string config_dir_path, string videos_root_dir) {\n  MostCV::fixDir(config_dir_path);\n\n  dataset_division_.push_back(VolleyballDatasetPart(\"train\", config_dir_path + \"train.txt\", videos_root_dir));\n  dataset_division_.push_back(VolleyballDatasetPart(\"val\", config_dir_path + \"val.txt\", videos_root_dir));\n  dataset_division_.push_back(VolleyballDatasetPart(\"test\", config_dir_path + \"test.txt\", videos_root_dir));\n  dataset_division_.push_back(VolleyballDatasetPart(\"trainval\", config_dir_path + \"trainval.txt\", videos_root_dir));\n\n  total_videos_ = 0;\n  total_frames_ = 0;\n\n  // Remove empty datasets\n  for (int i = 0; i < (int) dataset_division_.size(); ++i) {\n    if (dataset_division_[i].videos_vec_.size() == 0) {\n      cerr << 
dataset_division_[i].dataset_name_ << \" dataset is EMPTY\\n\";\n\n      dataset_division_.erase(dataset_division_.begin() + i);\n      --i;\n    }\n  }\n\n  assert(dataset_division_.size() > 0);\n\n  for (auto dataset : dataset_division_) {\n    int current_frames = 0;\n\n    for (auto video : dataset.videos_vec_) {\n      total_frames_ += video.annot_frame_id_vec_.size();\n      current_frames += video.annot_frame_id_vec_.size();\n    }\n    cerr << \"Total frames for dataset \" << dataset.dataset_name_ << \" = \" << current_frames << \"\\n\";\n\n    total_videos_ += dataset.videos_vec_.size();\n  }\n\n  total_scene_labels = scene_activities_ids_map.size();\n  total_persons_labels = persons_actions_ids_map.size();\n\n  cerr << \"\\nTotal videos = \" << total_videos_ << \" - total frames = \" << total_frames_ << \"\\n\";\n\n  cerr << \"\\nScenes Labels:\\n\";\n  for (auto scene_kv : scene_activities_ids_map)\n    cerr << \"\\t\" << scene_kv.first << \" \" << scene_kv.second << \"\\n\";\n\n  cerr << \"\\nPersons Labels:\\n\";\n  for (auto persons_kv : persons_actions_ids_map)\n    cerr << \"\\t\" << persons_kv.first << \" \" << persons_kv.second << \"\\n\";\n\n  cerr << \"\\nScenes Labels frequency:\\n\";\n  for (auto entry : scene_activities_freq_map)\n    cerr << \"\\t\" << entry.first << \" \" << entry.second << \"\\n\";\n\n  cerr << \"\\nPlayers Labels frequency:\\n\";\n  for (auto entry : players_activities_freq_map)\n    cerr << \"\\t\" << entry.first << \" \" << entry.second << \"\\n\";\n}\n\nint VolleyballDatasetMgr::GetActivityId(string video_id, string frame_id)\n{\n  string video_id_frame_id = video_id + \"#\" + frame_id;\n\n  if (global_video_id_frame_id_to_activityId.count(video_id_frame_id) == 0)\n  {\n    cerr<<\"problem with \"<<video_id_frame_id<<\"\\n\\n\";\n    return -1;\n  }\n\n  return global_video_id_frame_id_to_activityId[video_id_frame_id];\n}\n\nvector<VolleyballPerson> 
VolleyballDatasetMgr::GetPersons(string video_id, string frame_id)\n{\n  string video_id_frame_id = video_id + \"#\" + frame_id;\n\n  assert( global_video_id_frame_id_to_persons.count(video_id_frame_id) );\n\n  return global_video_id_frame_id_to_persons[video_id_frame_id];\n}\n\n\n// Verify that 2*w+1 frames exist on disk, centered around every annotated frame\nvoid VolleyballDatasetMgr::VerifyDataAvailbility(int temporal_window)\n{\n  for (auto dataset : dataset_division_) {\n    cerr<<\"Verifying dataset: \"<<dataset.dataset_name_<<\"\\n\";\n    for (auto video : dataset.videos_vec_) {\n      for (auto frame_id : video.annot_frame_id_vec_) {\n        video.GetTemporalWindowPaths(frame_id, temporal_window, 1, true);\n      }\n    }\n  }\n}\n\n\n}\n
  },
  {
    "path": "src/volleyball-dataset-mgr.h",
    "content": "/*\n * volleyball-dataset-mgr.h\n *\n *  Created on: Nov 28, 2015\n *      Author: msibrahi\n */\n\n#ifndef VOLLEYBALL_DATASET_MGR_H_FINAL_DATASET_\n#define VOLLEYBALL_DATASET_MGR_H_FINAL_DATASET_\n\n#include <string>\n#include <vector>\n#include <set>\n#include <map>\nusing std::vector;\nusing std::set;\nusing std::map;\nusing std::string;\nusing std::pair;\n\n#include \"opencv2/core/core.hpp\"\n#include \"opencv2/highgui/highgui.hpp\"\n#include \"opencv2/imgproc/imgproc.hpp\"\nusing cv::Mat;\nusing cv::Size;\nusing cv::Ptr;\n\n#include \"rect-helper.h\"\n\nnamespace MostCV {\n\n\nclass VolleyballPerson {\n public:\n  RectHelper  bbox_;\n  int action_id_;\n};\n\nclass VolleyballVideoData {\n public:\n  VolleyballVideoData() {}\n  VolleyballVideoData(string video_id, string video_dir);\n\n  string GetFramePath(string frame_id, int shift = 0);\n  pair< vector<string>, vector<string> > GetTemporalWindowPaths(string frame_id, int temporal_window, int step = 1, bool is_use_expend_factor = true);\n  vector<string> GetTemporalWindowPathsMerged(string frame_id, int temporal_window, int step = 1);\n  void ResetPersons(string frame_id, vector<RectHelper> rects);\n  vector<RectHelper> GetPersonsRect(string frame_id);\n  void SortPersonsPerFrames();\n  void visualize();\n\n\n  string video_id_;\n  string video_dir_;\n\n  vector<string>  annot_frame_id_vec_;\n  map<string, int> annot_frame_id_to_activity_id_map_;\n  map<string, pair<int, int>> annot_frame_id_to_min_max_persons_y_map_;\n\n  map<string, vector<VolleyballPerson> > annot_frame_id_persons_map_;\n};\n\nclass VolleyballDatasetPart {\n public:\n  VolleyballDatasetPart() {}\n  VolleyballDatasetPart(string dataset_name, string config_file, string videos_root_dir);\n  void ReorderVideos(vector<string> video_ids);\n  vector<pair<VolleyballVideoData, int> > GetVideoFrameList(bool is_shuffled, int subset_percent);\n  void visualize();\n\n  vector<string> ids_;\n  vector<VolleyballVideoData> videos_vec_;\n  string dataset_name_;\n\n  string 
dataset_db_name_;\n  string dataset_db_path_;\n\n};\n\nclass VolleyballDatasetMgr {\n public:\n  VolleyballDatasetMgr(string config_dir_path, string videos_root_dir);\n\n  void VerifyDataAvailbility(int temporal_window);\n\n  int GetActivityId(string video_id, string frame_id);\n  vector<VolleyballPerson> GetPersons(string video_id, string frame_id);\n\n  vector<VolleyballDatasetPart> dataset_division_;\n  int total_videos_;\n  int total_frames_;\n  int total_scene_labels;\n  int total_persons_labels;\n};\n\n}\n\n#endif /* VOLLEYBALL_DATASET_MGR_H_FINAL_DATASET_ */\n"
  },
  {
    "path": "volleyball-simple/39/annotations.txt",
    "content": "29885.jpg r_spike 430 575 82 170 waiting 563 491 82 177 waiting 585 645 94 190 digging 990 452 54 168 standing 1003 501 50 240 blocking 957 662 75 189 standing 1203 415 94 150 moving 1137 490 93 180 standing 1135 488 108 218 spiking 1272 718 121 187 moving 1402 539 96 168 moving 1648 535 70 167 standing\n29905.jpg l-pass 619 582 86 131 falling 784 581 122 205 falling 740 505 107 139 waiting 1099 609 103 207 standing 1149 455 58 144 standing 1315 678 87 198 standing 1268 401 72 166 standing 1175 576 54 193 standing 1231 543 68 190 standing 1336 485 92 172 standing 1389 549 84 182 standing 1723 556 60 175 standing\n"
  },
  {
    "path": "volleyball-simple/41/annotations.txt",
    "content": "19515.jpg l-pass 1487 607 61 146 standing 1399 685 48 157 standing 1367 555 78 128 standing 1031 641 61 158 standing 1022 584 42 151 standing 999 556 63 133 standing 901 594 55 154 standing 722 577 51 128 standing 649 615 60 155 digging 573 669 96 124 standing 510 537 60 111 moving 694 493 55 110 standing\n19560.jpg r_spike 1383 574 65 133 standing 1359 610 53 140 standing 1063 611 59 158 standing 1101 565 64 155 standing 1117 547 48 133 standing 1123 522 53 119 standing 899 579 53 154 standing 800 634 53 171 standing 726 550 52 138 standing 760 496 64 137 moving 560 496 42 179 setting 502 616 61 153 standing\n"
  }
]