[
  {
    "path": ".gitignore",
    "content": "sqlgen\n*.sqlite\n*.txt\n_docs\n"
  },
  {
    "path": "LICENSE",
    "content": "Copyright (c) 2015, drone.io\nAll rights reserved.\n\nRedistribution and use in source and binary forms, with or without\nmodification, are permitted provided that the following conditions are met:\n\n* Redistributions of source code must retain the above copyright notice, this\n  list of conditions and the following disclaimer.\n\n* Redistributions in binary form must reproduce the above copyright notice,\n  this list of conditions and the following disclaimer in the documentation\n  and/or other materials provided with the distribution.\n\nTHIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS \"AS IS\"\nAND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE\nIMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE\nDISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT HOLDER OR CONTRIBUTORS BE LIABLE\nFOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL\nDAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR\nSERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER\nCAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY,\nOR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE\nOF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.\n\n"
  },
  {
    "path": "README.md",
    "content": "**sqlgen** generates SQL statements and database helper functions from your Go structs. It can be used in place of a simple ORM or hand-written SQL. See the [demo](https://github.com/drone/sqlgen/tree/master/demo) directory for examples.\n\n### Install\n\nInstall or upgrade with this command:\n\n```\ngo get -u github.com/drone/sqlgen\n```\n\n### Usage\n\n```\nUsage of sqlgen:\n  -type string\n    \ttype to generate; required\n  -file string\n    \tinput file name; required\n  -o string\n    \toutput file name\n  -pkg string\n    \toutput package name\n  -db string\n    \tsql dialect; sqlite, postgres, mysql\n  -schema\n    \tgenerate sql schema and queries; default true\n  -funcs\n    \tgenerate sql helper functions; default true\n```\n\n### Tutorial\n\nFirst, let's start with a simple `User` struct in `user.go`:\n\n```Go\ntype User struct {\n\tID     int64\n\tLogin  string\n\tEmail  string\n}\n```\n\nWe can run the following command:\n\n```\nsqlgen -file user.go -type User -pkg demo\n```\n\nThe tool outputs the following generated code:\n\n```Go\nfunc ScanUser(row *sql.Row) (*User, error) {\n\tvar v0 int64\n\tvar v1 string\n\tvar v2 string\n\n\terr := row.Scan(\n\t\t&v0,\n\t\t&v1,\n\t\t&v2,\n\t)\n\tif err != nil {\n\t\treturn nil, err\n\t}\n\n\tv := &User{}\n\tv.ID = v0\n\tv.Login = v1\n\tv.Email = v2\n\n\treturn v, nil\n}\n\nconst CreateUserStmt = `\nCREATE TABLE IF NOT EXISTS users (\n user_id     INTEGER\n,user_login  TEXT\n,user_email  TEXT\n);\n`\n\nconst SelectUserStmt = `\nSELECT \n user_id\n,user_login\n,user_email\nFROM users \n`\n\nconst SelectUserRangeStmt = `\nSELECT \n user_id\n,user_login\n,user_email\nFROM users \nLIMIT ? OFFSET ?\n`\n\n\n// more functions and sql statements not displayed\n```\n\nThis is a great start, but what if we want to specify primary keys, column sizes and more? This may be acheived by annotating your code using Go tags. 
For example, we can tag the `ID` field to indicate it is a primary key and will auto increment:\n\n```diff\ntype User struct {\n-   ID      int64\n+   ID      int64  `sql:\"pk: true, auto: true\"`\n    Login   string\n    Email   string\n}\n```\n\nThis information allows the tool to generate smarter SQL statements:\n\n```diff\nCREATE TABLE IF NOT EXISTS users (\n-user_id     INTEGER\n+user_id     INTEGER PRIMARY KEY AUTOINCREMENT\n,user_login  TEXT\n,user_email  TEXT\n);\n```\n\nThis includes SQL statements to select, update and delete data using the primary key:\n\n```Go\nconst SelectUserPkeyStmt = `\nSELECT \n user_id\n,user_login\n,user_email\nFROM users \nWHERE user_id=?\n`\n\nconst UpdateUserPkeyStmt = `\nUPDATE users SET \n user_id=?\n,user_login=?\n,user_email=?\nWHERE user_id=?\n`\n\nconst DeleteUserPkeyStmt = `\nDELETE FROM users \nWHERE user_id=?\n`\n```\n\nWe can take this one step further and annotate indexes. In our example, we probably want to make sure the `user_login` field has a unique index:\n\n```diff\ntype User struct {\n    ID      int64  `sql:\"pk: true, auto: true\"`\n-   Login   string\n+   Login   string `sql:\"unique: user_login\"`\n    Email   string\n}\n```\n\nThis information instructs the tool to generate the following:\n\n```Go\nconst CreateUserLoginStmt = `\nCREATE UNIQUE INDEX IF NOT EXISTS user_login ON users (user_login)\n`\n```\n\nThe tool also assumes that we probably intend to fetch data from the database using this index. It will therefore automatically generate the following queries:\n\n```Go\nconst SelectUserLoginStmt = `\nSELECT \n user_id\n,user_login\n,user_email\nFROM users \nWHERE user_login=?\n`\n\nconst UpdateUserLoginStmt = `\nUPDATE users SET \n user_id=?\n,user_login=?\n,user_email=?\nWHERE user_login=?\n`\n\nconst DeleteUserLoginStmt = `\nDELETE FROM users \nWHERE user_login=?\n`\n```\n\n### Nesting\n\nNested Go structures can be flattened into a single database table. 
As an example, we have a `User` and `Address` with a one-to-one relationship. In some cases, we may prefer to denormalize our data and store it in a single table, avoiding unnecessary joins.\n\n```diff\ntype User struct {\n    ID     int64  `sql:\"pk: true\"`\n    Login  string\n    Email  string\n+   Addr   *Address\n}\n\ntype Address struct {\n    City   string\n    State  string\n    Zip    string `sql:\"index: user_zip\"`\n}\n```\n\nThe above relationship is flattened into a single table (see below). When the data is retrieved from the database, the nested structure is restored.\n\n```sql\nCREATE TABLE IF NOT EXISTS users (\n user_id         INTEGER PRIMARY KEY\n,user_login      TEXT\n,user_email      TEXT\n,user_addr_city  TEXT\n,user_addr_state TEXT\n,user_addr_zip   TEXT\n);\n```\n\n### JSON Encoding\n\nSome types in your struct may not have native equivalents in your database, such as `[]string`. These values can be marshaled and stored as JSON in the database.\n\n```diff\ntype User struct {\n    ID     int64  `sql:\"pk: true\"`\n    Login  string\n    Email  string\n+   Label  []string `sql:\"encode: json\"`\n}\n```\n\n### Dialects\n\nYou may specify one of the following SQL dialects when generating your code: `postgres`, `mysql` and `sqlite`. The default value is `sqlite`.\n\n```\nsqlgen -file user.go -type User -pkg demo -db postgres\n```\n\n### Go Generate\n\nExample use with `go:generate`:\n\n```Go\npackage demo\n\n//go:generate sqlgen -file user.go -type User -pkg demo -o user_sql.go\n\ntype User struct {\n    ID     int64  `sql:\"pk: true, auto: true\"`\n    Login  string `sql:\"unique: user_login\"`\n    Email  string `sql:\"size: 1024\"`\n    Avatar string\n}\n```\n\n### Benchmarks\n\nThis tool demonstrates performance gains, albeit small, over lightweight ORM packages such as `sqlx` and `meddler`. 
Over time I plan to expand the benchmarks to include additional ORM packages.\n\nTo run the project benchmarks:\n\n```\ngo get ./...\ngo generate ./...\ngo build\ncd bench\ngo test -bench=Bench\n```\n\nExample selecting a single row:\n\n```\nBenchmarkMeddlerRow-4      30000        42773 ns/op\nBenchmarkSqlxRow-4         30000        41554 ns/op\nBenchmarkSqlgenRow-4       50000        39664 ns/op\n```\n\nSelecting multiple rows:\n\n```\nBenchmarkMeddlerRows-4      2000      1025218 ns/op\nBenchmarkSqlxRows-4         2000       807213 ns/op\nBenchmarkSqlgenRows-4       2000       700673 ns/op\n```\n\n### Credits\n\nThis tool was inspired by [scaneo](https://github.com/variadico/scaneo).\n"
  },
  {
    "path": "bench/type.go",
    "content": "package bench\n\n//go:generate ../sqlgen -file type.go -type User -pkg bench -o type_sql.go\n\ntype User struct {\n\tID      int64  `sql:\"pk: true, auto: true\"   meddler:\"user_id,pk\"   db:\"user_id\"`\n\tName    string `sql:\"unique: user_name\"      meddler:\"user_name\"    db:\"user_name\"`\n\tPass    string `sql:\"\"                       meddler:\"user_pass\"    db:\"user_pass\"`\n\tEmail   string `sql:\"unique: user_email\"     meddler:\"user_email\"   db:\"user_email\"`\n\tActive  bool   `sql:\"index:  user_active\"    meddler:\"user_active\"  db:\"user_active\"`\n\tCreated int64  `sql:\"\"                       meddler:\"user_created\" db:\"user_created\"`\n\tUpdated int64  `sql:\"\"                       meddler:\"user_updated\" db:\"user_updated\"`\n}\n"
  },
  {
    "path": "bench/type_sql.go",
    "content": "package bench\n\n// THIS FILE WAS AUTO-GENERATED. DO NOT MODIFY.\n\nimport (\n\t\"database/sql\"\n)\n\nfunc ScanUser(row *sql.Row) (*User, error) {\n\tvar v0 int64\n\tvar v1 string\n\tvar v2 string\n\tvar v3 string\n\tvar v4 bool\n\tvar v5 int64\n\tvar v6 int64\n\n\terr := row.Scan(\n\t\t&v0,\n\t\t&v1,\n\t\t&v2,\n\t\t&v3,\n\t\t&v4,\n\t\t&v5,\n\t\t&v6,\n\t)\n\tif err != nil {\n\t\treturn nil, err\n\t}\n\n\tv := &User{}\n\tv.ID = v0\n\tv.Name = v1\n\tv.Pass = v2\n\tv.Email = v3\n\tv.Active = v4\n\tv.Created = v5\n\tv.Updated = v6\n\n\treturn v, nil\n}\n\nfunc ScanUsers(rows *sql.Rows) ([]*User, error) {\n\tvar err error\n\tvar vv []*User\n\n\tvar v0 int64\n\tvar v1 string\n\tvar v2 string\n\tvar v3 string\n\tvar v4 bool\n\tvar v5 int64\n\tvar v6 int64\n\n\tfor rows.Next() {\n\t\terr = rows.Scan(\n\t\t\t&v0,\n\t\t\t&v1,\n\t\t\t&v2,\n\t\t\t&v3,\n\t\t\t&v4,\n\t\t\t&v5,\n\t\t\t&v6,\n\t\t)\n\t\tif err != nil {\n\t\t\treturn vv, err\n\t\t}\n\n\t\tv := &User{}\n\t\tv.ID = v0\n\t\tv.Name = v1\n\t\tv.Pass = v2\n\t\tv.Email = v3\n\t\tv.Active = v4\n\t\tv.Created = v5\n\t\tv.Updated = v6\n\n\t\tvv = append(vv, v)\n\t}\n\treturn vv, rows.Err()\n}\n\nfunc SliceUser(v *User) []interface{} {\n\tvar v0 int64\n\tvar v1 string\n\tvar v2 string\n\tvar v3 string\n\tvar v4 bool\n\tvar v5 int64\n\tvar v6 int64\n\n\tv0 = v.ID\n\tv1 = v.Name\n\tv2 = v.Pass\n\tv3 = v.Email\n\tv4 = v.Active\n\tv5 = v.Created\n\tv6 = v.Updated\n\n\treturn []interface{}{\n\t\tv0,\n\t\tv1,\n\t\tv2,\n\t\tv3,\n\t\tv4,\n\t\tv5,\n\t\tv6,\n\t}\n}\n\nfunc SelectUser(db *sql.DB, query string, args ...interface{}) (*User, error) {\n\trow := db.QueryRow(query, args...)\n\treturn ScanUser(row)\n}\n\nfunc SelectUsers(db *sql.DB, query string, args ...interface{}) ([]*User, error) {\n\trows, err := db.Query(query, args...)\n\tif err != nil {\n\t\treturn nil, err\n\t}\n\tdefer rows.Close()\n\treturn ScanUsers(rows)\n}\n\nfunc InsertUser(db *sql.DB, query string, v *User) error {\n\n\tres, err := 
db.Exec(query, SliceUser(v)[1:]...)\n\tif err != nil {\n\t\treturn err\n\t}\n\n\tv.ID, err = res.LastInsertId()\n\treturn err\n}\n\nfunc UpdateUser(db *sql.DB, query string, v *User) error {\n\n\targs := SliceUser(v)[1:]\n\targs = append(args, v.ID)\n\t_, err := db.Exec(query, args...)\n\treturn err\n}\n\nconst CreateUserStmt = `\nCREATE TABLE IF NOT EXISTS users (\n user_id      INTEGER PRIMARY KEY AUTOINCREMENT\n,user_name    TEXT\n,user_pass    TEXT\n,user_email   TEXT\n,user_active  BOOLEAN\n,user_created INTEGER\n,user_updated INTEGER\n);\n`\n\nconst InsertUserStmt = `\nINSERT INTO users (\n user_name\n,user_pass\n,user_email\n,user_active\n,user_created\n,user_updated\n) VALUES (?,?,?,?,?,?)\n`\n\nconst SelectUserStmt = `\nSELECT \n user_id\n,user_name\n,user_pass\n,user_email\n,user_active\n,user_created\n,user_updated\nFROM users \n`\n\nconst SelectUserRangeStmt = `\nSELECT \n user_id\n,user_name\n,user_pass\n,user_email\n,user_active\n,user_created\n,user_updated\nFROM users \nLIMIT ? OFFSET ?\n`\n\nconst SelectUserCountStmt = `\nSELECT count(1)\nFROM users \n`\n\nconst SelectUserPkeyStmt = `\nSELECT \n user_id\n,user_name\n,user_pass\n,user_email\n,user_active\n,user_created\n,user_updated\nFROM users \nWHERE user_id=?\n`\n\nconst UpdateUserPkeyStmt = `\nUPDATE users SET \n user_id=?\n,user_name=?\n,user_pass=?\n,user_email=?\n,user_active=?\n,user_created=?\n,user_updated=? \nWHERE user_id=?\n`\n\nconst DeleteUserPkeyStmt = `\nDELETE FROM users \nWHERE user_id=?\n`\n\nconst CreateUserNameStmt = `\nCREATE UNIQUE INDEX IF NOT EXISTS user_name ON users (user_name)\n`\n\nconst SelectUserNameStmt = `\nSELECT \n user_id\n,user_name\n,user_pass\n,user_email\n,user_active\n,user_created\n,user_updated\nFROM users \nWHERE user_name=?\n`\n\nconst UpdateUserNameStmt = `\nUPDATE users SET \n user_id=?\n,user_name=?\n,user_pass=?\n,user_email=?\n,user_active=?\n,user_created=?\n,user_updated=? 
\nWHERE user_name=?\n`\n\nconst DeleteUserNameStmt = `\nDELETE FROM users \nWHERE user_name=?\n`\n\nconst CreateUserEmailStmt = `\nCREATE UNIQUE INDEX IF NOT EXISTS user_email ON users (user_email)\n`\n\nconst SelectUserEmailStmt = `\nSELECT \n user_id\n,user_name\n,user_pass\n,user_email\n,user_active\n,user_created\n,user_updated\nFROM users \nWHERE user_email=?\n`\n\nconst UpdateUserEmailStmt = `\nUPDATE users SET \n user_id=?\n,user_name=?\n,user_pass=?\n,user_email=?\n,user_active=?\n,user_created=?\n,user_updated=? \nWHERE user_email=?\n`\n\nconst DeleteUserEmailStmt = `\nDELETE FROM users \nWHERE user_email=?\n`\n"
  },
  {
    "path": "bench/type_test.go",
    "content": "package bench\n\nimport (\n\t\"database/sql\"\n\t\"testing\"\n\t\"time\"\n\n\t\"github.com/Pallinder/go-randomdata\"\n\t\"github.com/jmoiron/sqlx\"\n\t_ \"github.com/mattn/go-sqlite3\"\n\t\"github.com/russross/meddler\"\n)\n\nvar db *sql.DB\nvar dbx *sqlx.DB\n\nfunc init() {\n\tvar err error\n\tdb, err = sql.Open(\"sqlite3\", \":memory:\")\n\tif err != nil {\n\t\tpanic(err)\n\t}\n\tdb.Exec(\"DROP TABLE users;\")\n\tdbx = sqlx.NewDb(db, \"sqlite3\")\n\n\tddl := []string{CreateUserStmt}\n\tfor _, stmt := range ddl {\n\t\t_, err = db.Exec(stmt)\n\t\tif err != nil {\n\t\t\tpanic(err)\n\t\t}\n\t}\n\n\tfor i := 0; i < 100; i++ {\n\t\tuser := &User{}\n\t\tuser.Name = randomdata.FullName(randomdata.RandomGender)\n\t\tuser.Email = randomdata.Email()\n\t\tuser.Pass = \"pa55word\"\n\t\tuser.Created = time.Now().Unix()\n\t\tuser.Updated = time.Now().Unix()\n\n\t\terr := InsertUser(db, InsertUserStmt, user)\n\t\tif err != nil {\n\t\t\tpanic(err)\n\t\t}\n\t}\n}\n\nvar result *User\nvar results []*User\n\nfunc BenchmarkMeddlerRow(b *testing.B) {\n\tvar user *User\n\tvar err error\n\n\tfor n := 0; n < b.N; n++ {\n\t\tuser = &User{}\n\t\terr = meddler.QueryRow(db, user, SelectUserPkeyStmt, 1)\n\t\tif err != nil {\n\t\t\tpanic(err)\n\t\t}\n\t}\n\tresult = user\n}\n\nfunc BenchmarkMeddlerRows(b *testing.B) {\n\tvar users []*User\n\tvar err error\n\n\tfor n := 0; n < b.N; n++ {\n\t\terr = meddler.QueryAll(db, &users, SelectUserStmt)\n\t\tif err != nil {\n\t\t\tpanic(err)\n\t\t}\n\t}\n\tresults = users\n}\n\nfunc BenchmarkSqlxRow(b *testing.B) {\n\tvar user *User\n\tvar err error\n\n\tfor n := 0; n < b.N; n++ {\n\t\tuser = &User{}\n\t\terr = dbx.Get(user, SelectUserPkeyStmt, 1)\n\t\tif err != nil {\n\t\t\tpanic(err)\n\t\t}\n\t}\n\tresult = user\n}\n\nfunc BenchmarkSqlxRows(b *testing.B) {\n\tvar users []*User\n\tvar err error\n\n\tfor n := 0; n < b.N; n++ {\n\t\terr = dbx.Select(&users, SelectUserStmt)\n\t\tif err != nil {\n\t\t\tpanic(err)\n\t\t}\n\t}\n\tresults = 
users\n}\n\nfunc BenchmarkSqlgenRow(b *testing.B) {\n\tvar user *User\n\tvar err error\n\n\tfor n := 0; n < b.N; n++ {\n\t\tuser, err = SelectUser(db, SelectUserPkeyStmt, 1)\n\t\tif err != nil {\n\t\t\tpanic(err)\n\t\t}\n\t}\n\tresult = user\n}\n\nfunc BenchmarkSqlgenRows(b *testing.B) {\n\tvar users []*User\n\tvar err error\n\n\tfor n := 0; n < b.N; n++ {\n\t\tusers, err = SelectUsers(db, SelectUserStmt)\n\t\tif err != nil {\n\t\t\tpanic(err)\n\t\t}\n\t}\n\tresults = users\n}\n"
  },
  {
    "path": "demo/hook.go",
    "content": "package demo\n\n//go:generate ../sqlgen -file hook.go -type Hook -pkg demo -o hook_sql.go -db mysql\n\ntype Hook struct {\n\tID         int64 `sql:\"pk: true, auto: true\"`\n\tSha        string\n\tAfter      string\n\tBefore     string\n\tCreated    bool\n\tDeleted    bool\n\tForced     bool\n\tHeadCommit *Commit `sql:\"name: head\"`\n}\n\ntype Commit struct {\n\tID        string\n\tMessage   string\n\tTimestamp string\n\tAuthor    *Author\n\tCommitter *Author\n}\n\ntype Author struct {\n\tName     string\n\tEmail    string\n\tUsername string\n}\n"
  },
  {
    "path": "demo/hook_sql.go",
    "content": "package demo\n\n// THIS FILE WAS AUTO-GENERATED. DO NOT MODIFY.\n\nimport (\n\t\"database/sql\"\n)\n\nfunc ScanHook(row *sql.Row) (*Hook, error) {\n\tvar v0 int64\n\tvar v1 string\n\tvar v2 string\n\tvar v3 string\n\tvar v4 bool\n\tvar v5 bool\n\tvar v6 bool\n\tvar v7 string\n\tvar v8 string\n\tvar v9 string\n\tvar v10 string\n\tvar v11 string\n\tvar v12 string\n\tvar v13 string\n\tvar v14 string\n\tvar v15 string\n\n\terr := row.Scan(\n\t\t&v0,\n\t\t&v1,\n\t\t&v2,\n\t\t&v3,\n\t\t&v4,\n\t\t&v5,\n\t\t&v6,\n\t\t&v7,\n\t\t&v8,\n\t\t&v9,\n\t\t&v10,\n\t\t&v11,\n\t\t&v12,\n\t\t&v13,\n\t\t&v14,\n\t\t&v15,\n\t)\n\tif err != nil {\n\t\treturn nil, err\n\t}\n\n\tv := &Hook{}\n\tv.ID = v0\n\tv.Sha = v1\n\tv.After = v2\n\tv.Before = v3\n\tv.Created = v4\n\tv.Deleted = v5\n\tv.Forced = v6\n\tv.HeadCommit = &Commit{}\n\tv.HeadCommit.ID = v7\n\tv.HeadCommit.Message = v8\n\tv.HeadCommit.Timestamp = v9\n\tv.HeadCommit.Author = &Author{}\n\tv.HeadCommit.Author.Name = v10\n\tv.HeadCommit.Author.Email = v11\n\tv.HeadCommit.Author.Username = v12\n\tv.HeadCommit.Committer = &Author{}\n\tv.HeadCommit.Committer.Name = v13\n\tv.HeadCommit.Committer.Email = v14\n\tv.HeadCommit.Committer.Username = v15\n\n\treturn v, nil\n}\n\nfunc ScanHooks(rows *sql.Rows) ([]*Hook, error) {\n\tvar err error\n\tvar vv []*Hook\n\n\tvar v0 int64\n\tvar v1 string\n\tvar v2 string\n\tvar v3 string\n\tvar v4 bool\n\tvar v5 bool\n\tvar v6 bool\n\tvar v7 string\n\tvar v8 string\n\tvar v9 string\n\tvar v10 string\n\tvar v11 string\n\tvar v12 string\n\tvar v13 string\n\tvar v14 string\n\tvar v15 string\n\n\tfor rows.Next() {\n\t\terr = rows.Scan(\n\t\t\t&v0,\n\t\t\t&v1,\n\t\t\t&v2,\n\t\t\t&v3,\n\t\t\t&v4,\n\t\t\t&v5,\n\t\t\t&v6,\n\t\t\t&v7,\n\t\t\t&v8,\n\t\t\t&v9,\n\t\t\t&v10,\n\t\t\t&v11,\n\t\t\t&v12,\n\t\t\t&v13,\n\t\t\t&v14,\n\t\t\t&v15,\n\t\t)\n\t\tif err != nil {\n\t\t\treturn vv, err\n\t\t}\n\n\t\tv := &Hook{}\n\t\tv.ID = v0\n\t\tv.Sha = v1\n\t\tv.After = v2\n\t\tv.Before = v3\n\t\tv.Created = 
v4\n\t\tv.Deleted = v5\n\t\tv.Forced = v6\n\t\tv.HeadCommit = &Commit{}\n\t\tv.HeadCommit.ID = v7\n\t\tv.HeadCommit.Message = v8\n\t\tv.HeadCommit.Timestamp = v9\n\t\tv.HeadCommit.Author = &Author{}\n\t\tv.HeadCommit.Author.Name = v10\n\t\tv.HeadCommit.Author.Email = v11\n\t\tv.HeadCommit.Author.Username = v12\n\t\tv.HeadCommit.Committer = &Author{}\n\t\tv.HeadCommit.Committer.Name = v13\n\t\tv.HeadCommit.Committer.Email = v14\n\t\tv.HeadCommit.Committer.Username = v15\n\n\t\tvv = append(vv, v)\n\t}\n\treturn vv, rows.Err()\n}\n\nfunc SliceHook(v *Hook) []interface{} {\n\tvar v0 int64\n\tvar v1 string\n\tvar v2 string\n\tvar v3 string\n\tvar v4 bool\n\tvar v5 bool\n\tvar v6 bool\n\tvar v7 string\n\tvar v8 string\n\tvar v9 string\n\tvar v10 string\n\tvar v11 string\n\tvar v12 string\n\tvar v13 string\n\tvar v14 string\n\tvar v15 string\n\n\tv0 = v.ID\n\tv1 = v.Sha\n\tv2 = v.After\n\tv3 = v.Before\n\tv4 = v.Created\n\tv5 = v.Deleted\n\tv6 = v.Forced\n\tif v.HeadCommit != nil {\n\t\tv7 = v.HeadCommit.ID\n\t\tv8 = v.HeadCommit.Message\n\t\tv9 = v.HeadCommit.Timestamp\n\t\tif v.HeadCommit.Author != nil {\n\t\t\tv10 = v.HeadCommit.Author.Name\n\t\t\tv11 = v.HeadCommit.Author.Email\n\t\t\tv12 = v.HeadCommit.Author.Username\n\t\t}\n\t\t// guard the Committer access inside the HeadCommit nil check\n\t\t// to avoid a nil pointer dereference when HeadCommit is unset.\n\t\tif v.HeadCommit.Committer != nil {\n\t\t\tv13 = v.HeadCommit.Committer.Name\n\t\t\tv14 = v.HeadCommit.Committer.Email\n\t\t\tv15 = v.HeadCommit.Committer.Username\n\t\t}\n\t}\n\n\treturn []interface{}{\n\t\tv0,\n\t\tv1,\n\t\tv2,\n\t\tv3,\n\t\tv4,\n\t\tv5,\n\t\tv6,\n\t\tv7,\n\t\tv8,\n\t\tv9,\n\t\tv10,\n\t\tv11,\n\t\tv12,\n\t\tv13,\n\t\tv14,\n\t\tv15,\n\t}\n}\n\nfunc SelectHook(db *sql.DB, query string, args ...interface{}) (*Hook, error) {\n\trow := db.QueryRow(query, args...)\n\treturn ScanHook(row)\n}\n\nfunc SelectHooks(db *sql.DB, query string, args ...interface{}) ([]*Hook, error) {\n\trows, err := db.Query(query, args...)\n\tif err != nil {\n\t\treturn nil, err\n\t}\n\tdefer rows.Close()\n\treturn ScanHooks(rows)\n}\n\nfunc InsertHook(db 
*sql.DB, query string, v *Hook) error {\n\n\tres, err := db.Exec(query, SliceHook(v)[1:]...)\n\tif err != nil {\n\t\treturn err\n\t}\n\n\tv.ID, err = res.LastInsertId()\n\treturn err\n}\n\nfunc UpdateHook(db *sql.DB, query string, v *Hook) error {\n\n\targs := SliceHook(v)[1:]\n\targs = append(args, v.ID)\n\t_, err := db.Exec(query, args...)\n\treturn err\n}\n\nconst CreateHookStmt = `\nCREATE TABLE IF NOT EXISTS hooks (\n hook_id                      INTEGER PRIMARY KEY AUTO_INCREMENT\n,hook_sha                     VARCHAR(512)\n,hook_after                   VARCHAR(512)\n,hook_before                  VARCHAR(512)\n,hook_created                 BOOLEAN\n,hook_deleted                 BOOLEAN\n,hook_forced                  BOOLEAN\n,hook_head_id                 VARCHAR(512)\n,hook_head_message            VARCHAR(512)\n,hook_head_timestamp          VARCHAR(512)\n,hook_head_author_name        VARCHAR(512)\n,hook_head_author_email       VARCHAR(512)\n,hook_head_author_username    VARCHAR(512)\n,hook_head_committer_name     VARCHAR(512)\n,hook_head_committer_email    VARCHAR(512)\n,hook_head_committer_username VARCHAR(512)\n);\n`\n\nconst InsertHookStmt = `\nINSERT INTO hooks (\n hook_sha\n,hook_after\n,hook_before\n,hook_created\n,hook_deleted\n,hook_forced\n,hook_head_id\n,hook_head_message\n,hook_head_timestamp\n,hook_head_author_name\n,hook_head_author_email\n,hook_head_author_username\n,hook_head_committer_name\n,hook_head_committer_email\n,hook_head_committer_username\n) VALUES (?,?,?,?,?,?,?,?,?,?,?,?,?,?,?)\n`\n\nconst SelectHookStmt = `\nSELECT \n hook_id\n,hook_sha\n,hook_after\n,hook_before\n,hook_created\n,hook_deleted\n,hook_forced\n,hook_head_id\n,hook_head_message\n,hook_head_timestamp\n,hook_head_author_name\n,hook_head_author_email\n,hook_head_author_username\n,hook_head_committer_name\n,hook_head_committer_email\n,hook_head_committer_username\nFROM hooks \n`\n\nconst SelectHookRangeStmt = `\nSELECT \n 
hook_id\n,hook_sha\n,hook_after\n,hook_before\n,hook_created\n,hook_deleted\n,hook_forced\n,hook_head_id\n,hook_head_message\n,hook_head_timestamp\n,hook_head_author_name\n,hook_head_author_email\n,hook_head_author_username\n,hook_head_committer_name\n,hook_head_committer_email\n,hook_head_committer_username\nFROM hooks \nLIMIT ? OFFSET ?\n`\n\nconst SelectHookCountStmt = `\nSELECT count(1)\nFROM hooks \n`\n\nconst SelectHookPkeyStmt = `\nSELECT \n hook_id\n,hook_sha\n,hook_after\n,hook_before\n,hook_created\n,hook_deleted\n,hook_forced\n,hook_head_id\n,hook_head_message\n,hook_head_timestamp\n,hook_head_author_name\n,hook_head_author_email\n,hook_head_author_username\n,hook_head_committer_name\n,hook_head_committer_email\n,hook_head_committer_username\nFROM hooks \nWHERE hook_id=?\n`\n\nconst UpdateHookPkeyStmt = `\nUPDATE hooks SET \n hook_id=?\n,hook_sha=?\n,hook_after=?\n,hook_before=?\n,hook_created=?\n,hook_deleted=?\n,hook_forced=?\n,hook_head_id=?\n,hook_head_message=?\n,hook_head_timestamp=?\n,hook_head_author_name=?\n,hook_head_author_email=?\n,hook_head_author_username=?\n,hook_head_committer_name=?\n,hook_head_committer_email=?\n,hook_head_committer_username=? \nWHERE hook_id=?\n`\n\nconst DeleteHookPkeyStmt = `\nDELETE FROM hooks \nWHERE hook_id=?\n`\n"
  },
  {
    "path": "demo/issue.go",
    "content": "package demo\n\n//go:generate ../sqlgen -file issue.go -type Issue -pkg demo -o issue_sql.go -db postgres\n\ntype Issue struct {\n\tID       int64 `sql:\"pk: true, auto: true\"`\n\tNumber   int\n\tTitle    string   `sql:\"size: 512\"`\n\tBody     string   `sql:\"size: 2048\"`\n\tAssignee string   `sql:\"index: issue_assignee\"`\n\tState    string   `sql:\"size: 50\"`\n\tLabels   []string `sql:\"encode: json\"`\n\n\tlocked bool `sql:\"-\"`\n}\n"
  },
  {
    "path": "demo/issue_sql.go",
    "content": "package demo\n\n// THIS FILE WAS AUTO-GENERATED. DO NOT MODIFY.\n\nimport (\n\t\"database/sql\"\n\t\"encoding/json\"\n)\n\nfunc ScanIssue(row *sql.Row) (*Issue, error) {\n\tvar v0 int64\n\tvar v1 int\n\tvar v2 string\n\tvar v3 string\n\tvar v4 string\n\tvar v5 string\n\tvar v6 []byte\n\n\terr := row.Scan(\n\t\t&v0,\n\t\t&v1,\n\t\t&v2,\n\t\t&v3,\n\t\t&v4,\n\t\t&v5,\n\t\t&v6,\n\t)\n\tif err != nil {\n\t\treturn nil, err\n\t}\n\n\tv := &Issue{}\n\tv.ID = v0\n\tv.Number = v1\n\tv.Title = v2\n\tv.Body = v3\n\tv.Assignee = v4\n\tv.State = v5\n\tjson.Unmarshal(v6, &v.Labels)\n\n\treturn v, nil\n}\n\nfunc ScanIssues(rows *sql.Rows) ([]*Issue, error) {\n\tvar err error\n\tvar vv []*Issue\n\n\tvar v0 int64\n\tvar v1 int\n\tvar v2 string\n\tvar v3 string\n\tvar v4 string\n\tvar v5 string\n\tvar v6 []byte\n\n\tfor rows.Next() {\n\t\terr = rows.Scan(\n\t\t\t&v0,\n\t\t\t&v1,\n\t\t\t&v2,\n\t\t\t&v3,\n\t\t\t&v4,\n\t\t\t&v5,\n\t\t\t&v6,\n\t\t)\n\t\tif err != nil {\n\t\t\treturn vv, err\n\t\t}\n\n\t\tv := &Issue{}\n\t\tv.ID = v0\n\t\tv.Number = v1\n\t\tv.Title = v2\n\t\tv.Body = v3\n\t\tv.Assignee = v4\n\t\tv.State = v5\n\t\tjson.Unmarshal(v6, &v.Labels)\n\n\t\tvv = append(vv, v)\n\t}\n\treturn vv, rows.Err()\n}\n\nfunc SliceIssue(v *Issue) []interface{} {\n\tvar v0 int64\n\tvar v1 int\n\tvar v2 string\n\tvar v3 string\n\tvar v4 string\n\tvar v5 string\n\tvar v6 []byte\n\n\tv0 = v.ID\n\tv1 = v.Number\n\tv2 = v.Title\n\tv3 = v.Body\n\tv4 = v.Assignee\n\tv5 = v.State\n\tv6, _ = json.Marshal(&v.Labels)\n\n\treturn []interface{}{\n\t\tv0,\n\t\tv1,\n\t\tv2,\n\t\tv3,\n\t\tv4,\n\t\tv5,\n\t\tv6,\n\t}\n}\n\nfunc SelectIssue(db *sql.DB, query string, args ...interface{}) (*Issue, error) {\n\trow := db.QueryRow(query, args...)\n\treturn ScanIssue(row)\n}\n\nfunc SelectIssues(db *sql.DB, query string, args ...interface{}) ([]*Issue, error) {\n\trows, err := db.Query(query, args...)\n\tif err != nil {\n\t\treturn nil, err\n\t}\n\tdefer rows.Close()\n\treturn 
ScanIssues(rows)\n}\n\nfunc InsertIssue(db *sql.DB, query string, v *Issue) error {\n\n\tres, err := db.Exec(query, SliceIssue(v)[1:]...)\n\tif err != nil {\n\t\treturn err\n\t}\n\n\tv.ID, err = res.LastInsertId()\n\treturn err\n}\n\nfunc UpdateIssue(db *sql.DB, query string, v *Issue) error {\n\n\targs := SliceIssue(v)[1:]\n\targs = append(args, v.ID)\n\t_, err := db.Exec(query, args...)\n\treturn err\n}\n\nconst CreateIssueStmt = `\nCREATE TABLE IF NOT EXISTS issues (\n issue_id       SERIAL PRIMARY KEY \n,issue_number   INTEGER\n,issue_title    VARCHAR(512)\n,issue_body     VARCHAR(2048)\n,issue_assignee VARCHAR(512)\n,issue_state    VARCHAR(50)\n,issue_labels   BYTEA\n);\n`\n\nconst InsertIssueStmt = `\nINSERT INTO issues (\n issue_number\n,issue_title\n,issue_body\n,issue_assignee\n,issue_state\n,issue_labels\n) VALUES ($1,$2,$3,$4,$5,$6)\n`\n\nconst SelectIssueStmt = `\nSELECT \n issue_id\n,issue_number\n,issue_title\n,issue_body\n,issue_assignee\n,issue_state\n,issue_labels\nFROM issues \n`\n\nconst SelectIssueRangeStmt = `\nSELECT \n issue_id\n,issue_number\n,issue_title\n,issue_body\n,issue_assignee\n,issue_state\n,issue_labels\nFROM issues \nLIMIT $1 OFFSET $2\n`\n\nconst SelectIssueCountStmt = `\nSELECT count(1)\nFROM issues \n`\n\nconst SelectIssuePkeyStmt = `\nSELECT \n issue_id\n,issue_number\n,issue_title\n,issue_body\n,issue_assignee\n,issue_state\n,issue_labels\nFROM issues \nWHERE issue_id=$1\n`\n\nconst UpdateIssuePkeyStmt = `\nUPDATE issues SET \n issue_id=$1\n,issue_number=$2\n,issue_title=$3\n,issue_body=$4\n,issue_assignee=$5\n,issue_state=$6\n,issue_labels=$7 \nWHERE issue_id=$8\n`\n\nconst DeleteIssuePkeyStmt = `\nDELETE FROM issues \nWHERE issue_id=$1\n`\n"
  },
  {
    "path": "demo/user.go",
    "content": "package demo\n\n//go:generate ../sqlgen -file user.go -type User -pkg demo -o user_sql.go\n\ntype User struct {\n\tID     int64  `sql:\"pk: true, auto: true\"`\n\tLogin  string `sql:\"unique: user_login\"`\n\tEmail  string `sql:\"unique: user_email\"`\n\tAvatar string\n\tActive bool\n\tAdmin  bool\n\n\t// oauth token and secret\n\ttoken  string\n\tsecret string\n\n\t// randomly generated hash used to sign user\n\t// session and application tokens.\n\thash string\n}\n"
  },
  {
    "path": "demo/user_sql.go",
    "content": "package demo\n\n// THIS FILE WAS AUTO-GENERATED. DO NOT MODIFY.\n\nimport (\n\t\"database/sql\"\n)\n\nfunc ScanUser(row *sql.Row) (*User, error) {\n\tvar v0 int64\n\tvar v1 string\n\tvar v2 string\n\tvar v3 string\n\tvar v4 bool\n\tvar v5 bool\n\tvar v6 string\n\tvar v7 string\n\tvar v8 string\n\n\terr := row.Scan(\n\t\t&v0,\n\t\t&v1,\n\t\t&v2,\n\t\t&v3,\n\t\t&v4,\n\t\t&v5,\n\t\t&v6,\n\t\t&v7,\n\t\t&v8,\n\t)\n\tif err != nil {\n\t\treturn nil, err\n\t}\n\n\tv := &User{}\n\tv.ID = v0\n\tv.Login = v1\n\tv.Email = v2\n\tv.Avatar = v3\n\tv.Active = v4\n\tv.Admin = v5\n\tv.token = v6\n\tv.secret = v7\n\tv.hash = v8\n\n\treturn v, nil\n}\n\nfunc ScanUsers(rows *sql.Rows) ([]*User, error) {\n\tvar err error\n\tvar vv []*User\n\n\tvar v0 int64\n\tvar v1 string\n\tvar v2 string\n\tvar v3 string\n\tvar v4 bool\n\tvar v5 bool\n\tvar v6 string\n\tvar v7 string\n\tvar v8 string\n\n\tfor rows.Next() {\n\t\terr = rows.Scan(\n\t\t\t&v0,\n\t\t\t&v1,\n\t\t\t&v2,\n\t\t\t&v3,\n\t\t\t&v4,\n\t\t\t&v5,\n\t\t\t&v6,\n\t\t\t&v7,\n\t\t\t&v8,\n\t\t)\n\t\tif err != nil {\n\t\t\treturn vv, err\n\t\t}\n\n\t\tv := &User{}\n\t\tv.ID = v0\n\t\tv.Login = v1\n\t\tv.Email = v2\n\t\tv.Avatar = v3\n\t\tv.Active = v4\n\t\tv.Admin = v5\n\t\tv.token = v6\n\t\tv.secret = v7\n\t\tv.hash = v8\n\n\t\tvv = append(vv, v)\n\t}\n\treturn vv, rows.Err()\n}\n\nfunc SliceUser(v *User) []interface{} {\n\tvar v0 int64\n\tvar v1 string\n\tvar v2 string\n\tvar v3 string\n\tvar v4 bool\n\tvar v5 bool\n\tvar v6 string\n\tvar v7 string\n\tvar v8 string\n\n\tv0 = v.ID\n\tv1 = v.Login\n\tv2 = v.Email\n\tv3 = v.Avatar\n\tv4 = v.Active\n\tv5 = v.Admin\n\tv6 = v.token\n\tv7 = v.secret\n\tv8 = v.hash\n\n\treturn []interface{}{\n\t\tv0,\n\t\tv1,\n\t\tv2,\n\t\tv3,\n\t\tv4,\n\t\tv5,\n\t\tv6,\n\t\tv7,\n\t\tv8,\n\t}\n}\n\nfunc SelectUser(db *sql.DB, query string, args ...interface{}) (*User, error) {\n\trow := db.QueryRow(query, args...)\n\treturn ScanUser(row)\n}\n\nfunc SelectUsers(db *sql.DB, query string, args 
...interface{}) ([]*User, error) {\n\trows, err := db.Query(query, args...)\n\tif err != nil {\n\t\treturn nil, err\n\t}\n\tdefer rows.Close()\n\treturn ScanUsers(rows)\n}\n\nfunc InsertUser(db *sql.DB, query string, v *User) error {\n\n\tres, err := db.Exec(query, SliceUser(v)[1:]...)\n\tif err != nil {\n\t\treturn err\n\t}\n\n\tv.ID, err = res.LastInsertId()\n\treturn err\n}\n\nfunc UpdateUser(db *sql.DB, query string, v *User) error {\n\n\targs := SliceUser(v)[1:]\n\targs = append(args, v.ID)\n\t_, err := db.Exec(query, args...)\n\treturn err\n}\n\nconst CreateUserStmt = `\nCREATE TABLE IF NOT EXISTS users (\n user_id     INTEGER PRIMARY KEY AUTOINCREMENT\n,user_login  TEXT\n,user_email  TEXT\n,user_avatar TEXT\n,user_active BOOLEAN\n,user_admin  BOOLEAN\n,user_token  TEXT\n,user_secret TEXT\n,user_hash   TEXT\n);\n`\n\nconst InsertUserStmt = `\nINSERT INTO users (\n user_login\n,user_email\n,user_avatar\n,user_active\n,user_admin\n,user_token\n,user_secret\n,user_hash\n) VALUES (?,?,?,?,?,?,?,?)\n`\n\nconst SelectUserStmt = `\nSELECT \n user_id\n,user_login\n,user_email\n,user_avatar\n,user_active\n,user_admin\n,user_token\n,user_secret\n,user_hash\nFROM users \n`\n\nconst SelectUserRangeStmt = `\nSELECT \n user_id\n,user_login\n,user_email\n,user_avatar\n,user_active\n,user_admin\n,user_token\n,user_secret\n,user_hash\nFROM users \nLIMIT ? OFFSET ?\n`\n\nconst SelectUserCountStmt = `\nSELECT count(1)\nFROM users \n`\n\nconst SelectUserPkeyStmt = `\nSELECT \n user_id\n,user_login\n,user_email\n,user_avatar\n,user_active\n,user_admin\n,user_token\n,user_secret\n,user_hash\nFROM users \nWHERE user_id=?\n`\n\nconst UpdateUserPkeyStmt = `\nUPDATE users SET \n user_id=?\n,user_login=?\n,user_email=?\n,user_avatar=?\n,user_active=?\n,user_admin=?\n,user_token=?\n,user_secret=?\n,user_hash=? 
\nWHERE user_id=?\n`\n\nconst DeleteUserPkeyStmt = `\nDELETE FROM users \nWHERE user_id=?\n`\n\nconst CreateUserLoginStmt = `\nCREATE UNIQUE INDEX IF NOT EXISTS user_login ON users (user_login)\n`\n\nconst SelectUserLoginStmt = `\nSELECT \n user_id\n,user_login\n,user_email\n,user_avatar\n,user_active\n,user_admin\n,user_token\n,user_secret\n,user_hash\nFROM users \nWHERE user_login=?\n`\n\nconst UpdateUserLoginStmt = `\nUPDATE users SET \n user_id=?\n,user_login=?\n,user_email=?\n,user_avatar=?\n,user_active=?\n,user_admin=?\n,user_token=?\n,user_secret=?\n,user_hash=? \nWHERE user_login=?\n`\n\nconst DeleteUserLoginStmt = `\nDELETE FROM users \nWHERE user_login=?\n`\n\nconst CreateUserEmailStmt = `\nCREATE UNIQUE INDEX IF NOT EXISTS user_email ON users (user_email)\n`\n\nconst SelectUserEmailStmt = `\nSELECT \n user_id\n,user_login\n,user_email\n,user_avatar\n,user_active\n,user_admin\n,user_token\n,user_secret\n,user_hash\nFROM users \nWHERE user_email=?\n`\n\nconst UpdateUserEmailStmt = `\nUPDATE users SET \n user_id=?\n,user_login=?\n,user_email=?\n,user_avatar=?\n,user_active=?\n,user_admin=?\n,user_token=?\n,user_secret=?\n,user_hash=? \nWHERE user_email=?\n`\n\nconst DeleteUserEmailStmt = `\nDELETE FROM users \nWHERE user_email=?\n`\n"
  },
  {
    "path": "fmt.go",
    "content": "package main\n\nimport (\n\t\"bytes\"\n\t\"io\"\n\t\"os\"\n\t\"os/exec\"\n)\n\n// format formats a template using gofmt.\nfunc format(in io.Reader) (io.Reader, error) {\n\tvar out bytes.Buffer\n\n\tgofmt := exec.Command(\"gofmt\", \"-s\")\n\tgofmt.Stdin = in\n\tgofmt.Stdout = &out\n\tgofmt.Stderr = os.Stderr\n\terr := gofmt.Run()\n\treturn &out, err\n}\n"
  },
  {
    "path": "gen.go",
    "content": "package main\n\nimport (\n\t\"bytes\"\n\t\"flag\"\n\t\"fmt\"\n\t\"io\"\n\t\"os\"\n\n\t\"github.com/drone/sqlgen/parse\"\n\t\"github.com/drone/sqlgen/schema\"\n)\n\nvar (\n\tinput      = flag.String(\"file\", \"\", \"input file name; required\")\n\toutput     = flag.String(\"o\", \"\", \"output file name; required\")\n\tpkgName    = flag.String(\"pkg\", \"main\", \"output package name; required\")\n\ttypeName   = flag.String(\"type\", \"\", \"type to generate; required\")\n\tdatabase   = flag.String(\"db\", \"sqlite\", \"sql dialect; required\")\n\tgenSchema  = flag.Bool(\"schema\", true, \"generate sql schema and queries\")\n\tgenFuncs   = flag.Bool(\"funcs\", true, \"generate sql helper functions\")\n\textraFuncs = flag.Bool(\"extras\", true, \"generate extra sql helper functions\")\n)\n\nfunc main() {\n\tflag.Parse()\n\n\t// parses the syntax tree into something a bit\n\t// easier to work with.\n\ttree, err := parse.Parse(*input, *typeName)\n\tif err != nil {\n\t\tfmt.Fprintf(os.Stderr, \"%v\\n\", err)\n\t\tos.Exit(1)\n\t}\n\n\t// if the code is generated in a different folder\n\t// that the struct we need to import the struct\n\tif tree.Pkg != *pkgName && *pkgName != \"main\" {\n\t\t// TODO\n\t}\n\n\t// load the Tree into a schema Object\n\ttable := schema.Load(tree)\n\tdialect := schema.New(schema.Dialects[*database])\n\n\tvar buf bytes.Buffer\n\n\tif *genFuncs {\n\t\twritePackage(&buf, *pkgName)\n\t\twriteImports(&buf, tree, \"database/sql\")\n\t\twriteRowFunc(&buf, tree)\n\t\twriteRowsFunc(&buf, tree)\n\t\twriteSliceFunc(&buf, tree)\n\n\t\tif *extraFuncs {\n\t\t\twriteSelectRow(&buf, tree)\n\t\t\twriteSelectRows(&buf, tree)\n\t\t\twriteInsertFunc(&buf, tree)\n\t\t\twriteUpdateFunc(&buf, tree)\n\t\t}\n\t} else {\n\t\twritePackage(&buf, *pkgName)\n\t}\n\n\t// write the sql functions\n\tif *genSchema {\n\t\twriteSchema(&buf, dialect, table)\n\t}\n\n\t// formats the generated file using gofmt\n\tpretty, err := format(&buf)\n\tif err != nil 
{\n\t\tfmt.Fprintf(os.Stderr, \"%v\\n\", err)\n\t\treturn\n\t}\n\n\t// create output source for file. defaults to\n\t// stdout but may be file.\n\tvar out io.WriteCloser = os.Stdout\n\tif *output != \"\" {\n\t\tout, err = os.Create(*output)\n\t\tif err != nil {\n\t\t\tfmt.Fprintf(os.Stderr, \"%v\\n\", err)\n\t\t\treturn\n\t\t}\n\t\tdefer out.Close()\n\t}\n\n\tio.Copy(out, pretty)\n}\n"
  },
  {
    "path": "gen_funcs.go",
    "content": "package main\n\nimport (\n\t\"bytes\"\n\t\"fmt\"\n\t\"io\"\n\t\"strings\"\n\n\t\"github.com/acsellers/inflections\"\n\t\"github.com/drone/sqlgen/parse\"\n)\n\nfunc writeImports(w io.Writer, tree *parse.Node, pkgs ...string) {\n\tvar pmap = map[string]struct{}{}\n\n\t// add default packages\n\tfor _, pkg := range pkgs {\n\t\tpmap[pkg] = struct{}{}\n\t}\n\n\t// check each edge node to see if it is\n\t// encoded, which might require us to import\n\t// other packages\n\tfor _, node := range tree.Edges() {\n\t\tif node.Tags == nil || len(node.Tags.Encode) == 0 {\n\t\t\tcontinue\n\t\t}\n\t\tswitch node.Tags.Encode {\n\t\tcase \"json\":\n\t\t\tpmap[\"encoding/json\"] = struct{}{}\n\t\t\t// case \"gzip\":\n\t\t\t// \tpmap[\"compress/gzip\"] = struct{}{}\n\t\t\t// case \"snappy\":\n\t\t\t// \tpmap[\"github.com/golang/snappy\"] = struct{}{}\n\t\t}\n\t}\n\n\tif len(pmap) == 0 {\n\t\treturn\n\t}\n\n\t// write the import block, including each\n\t// encoder package that was specified.\n\tfmt.Fprintln(w, \"\\nimport (\")\n\tfor pkg, _ := range pmap {\n\t\tfmt.Fprintf(w, \"\\t%q\\n\", pkg)\n\t}\n\tfmt.Fprintln(w, \")\")\n}\n\nfunc writeSliceFunc(w io.Writer, tree *parse.Node) {\n\n\tvar buf1, buf2, buf3 bytes.Buffer\n\n\tvar i, depth int\n\tvar parent = tree\n\n\tfor _, node := range tree.Edges() {\n\t\tif node.Tags.Skip {\n\t\t\tcontinue\n\t\t}\n\n\t\t// temporary variable declaration\n\t\tswitch node.Kind {\n\t\tcase parse.Map, parse.Slice:\n\t\t\tfmt.Fprintf(&buf1, \"var v%d %s\\n\", i, \"[]byte\")\n\t\tdefault:\n\t\t\tfmt.Fprintf(&buf1, \"var v%d %s\\n\", i, node.Type)\n\t\t}\n\n\t\t// variable scanning\n\t\tfmt.Fprintf(&buf3, \"v%d,\\n\", i)\n\n\t\t// variable setting\n\t\tpath := node.Path()[1:]\n\n\t\t// if the parent is a ptr struct we\n\t\t// need to create a new\n\t\tif parent != node.Parent && node.Parent.Kind == parse.Ptr {\n\t\t\t// if node.Parent != nil && node.Parent.Parent != parent {\n\t\t\t// \tfmt.Fprintln(&buf2, \"}\\n\")\n\t\t\t// 
\tdepth--\n\t\t\t// }\n\n\t\t\t// seriously ... this works?\n\t\t\tif node.Parent != nil && node.Parent.Parent != parent {\n\t\t\t\tfor _, p := range path {\n\t\t\t\t\tif p == parent || depth == 0 {\n\t\t\t\t\t\tbreak\n\t\t\t\t\t}\n\t\t\t\t\tfmt.Fprintln(&buf2, \"}\\n\")\n\t\t\t\t\tdepth--\n\t\t\t\t}\n\t\t\t}\n\t\t\tdepth++\n\t\t\tfmt.Fprintf(&buf2, \"if v.%s != nil {\\n\", join(path[:len(path)-1], \".\"))\n\t\t}\n\n\t\tswitch node.Kind {\n\t\tcase parse.Map, parse.Slice, parse.Struct, parse.Ptr:\n\t\t\tfmt.Fprintf(&buf2, \"v%d, _ = json.Marshal(&v.%s)\\n\", i, join(path, \".\"))\n\t\tdefault:\n\t\t\tfmt.Fprintf(&buf2, \"v%d=v.%s\\n\", i, join(path, \".\"))\n\t\t}\n\n\t\tparent = node.Parent\n\t\ti++\n\t}\n\n\tfor depth != 0 {\n\t\tdepth--\n\t\tfmt.Fprintln(&buf2, \"}\\n\")\n\t}\n\n\tfmt.Fprintf(w,\n\t\tsSliceRow,\n\t\ttree.Type,\n\t\ttree.Type,\n\t\tbuf1.String(),\n\t\tbuf2.String(),\n\t\tbuf3.String(),\n\t)\n}\n\nfunc writeRowFunc(w io.Writer, tree *parse.Node) {\n\n\tvar buf1, buf2, buf3 bytes.Buffer\n\n\tvar i int\n\tvar parent = tree\n\tfor _, node := range tree.Edges() {\n\t\tif node.Tags.Skip {\n\t\t\tcontinue\n\t\t}\n\n\t\t// temporary variable declaration\n\t\tswitch node.Kind {\n\t\tcase parse.Map, parse.Slice:\n\t\t\tfmt.Fprintf(&buf1, \"var v%d %s\\n\", i, \"[]byte\")\n\t\tdefault:\n\t\t\tfmt.Fprintf(&buf1, \"var v%d %s\\n\", i, node.Type)\n\t\t}\n\n\t\t// variable scanning\n\t\tfmt.Fprintf(&buf2, \"&v%d,\\n\", i)\n\n\t\t// variable setting\n\t\tpath := node.Path()[1:]\n\n\t\t// if the parent is a ptr struct we\n\t\t// need to create a new\n\t\tif parent != node.Parent && node.Parent.Kind == parse.Ptr {\n\t\t\tfmt.Fprintf(&buf3, \"v.%s=&%s{}\\n\", join(path[:len(path)-1], \".\"), node.Parent.Type)\n\t\t}\n\n\t\tswitch node.Kind {\n\t\tcase parse.Map, parse.Slice, parse.Struct, parse.Ptr:\n\t\t\tfmt.Fprintf(&buf3, \"json.Unmarshal(v%d, &v.%s)\\n\", i, join(path, \".\"))\n\t\tdefault:\n\t\t\tfmt.Fprintf(&buf3, \"v.%s=v%d\\n\", join(path, \".\"), 
i)\n\t\t}\n\n\t\tparent = node.Parent\n\t\ti++\n\t}\n\n\tfmt.Fprintf(w,\n\t\tsScanRow,\n\t\ttree.Type,\n\t\ttree.Type,\n\t\tbuf1.String(),\n\t\tbuf2.String(),\n\t\ttree.Type,\n\t\tbuf3.String(),\n\t)\n}\n\nfunc writeRowsFunc(w io.Writer, tree *parse.Node) {\n\tvar buf1, buf2, buf3 bytes.Buffer\n\n\tvar i int\n\tvar parent = tree\n\tfor _, node := range tree.Edges() {\n\t\tif node.Tags.Skip {\n\t\t\tcontinue\n\t\t}\n\n\t\t// temporary variable declaration\n\t\tswitch node.Kind {\n\t\tcase parse.Map, parse.Slice:\n\t\t\tfmt.Fprintf(&buf1, \"var v%d %s\\n\", i, \"[]byte\")\n\t\tdefault:\n\t\t\tfmt.Fprintf(&buf1, \"var v%d %s\\n\", i, node.Type)\n\t\t}\n\n\t\t// variable scanning\n\t\tfmt.Fprintf(&buf2, \"&v%d,\\n\", i)\n\n\t\t// variable setting\n\t\tpath := node.Path()[1:]\n\n\t\t// if the parent is a ptr struct we\n\t\t// need to create a new\n\t\tif parent != node.Parent && node.Parent.Kind == parse.Ptr {\n\t\t\tfmt.Fprintf(&buf3, \"v.%s=&%s{}\\n\", join(path[:len(path)-1], \".\"), node.Parent.Type)\n\t\t}\n\n\t\tswitch node.Kind {\n\t\tcase parse.Map, parse.Slice, parse.Struct, parse.Ptr:\n\t\t\tfmt.Fprintf(&buf3, \"json.Unmarshal(v%d, &v.%s)\\n\", i, join(path, \".\"))\n\t\tdefault:\n\t\t\tfmt.Fprintf(&buf3, \"v.%s=v%d\\n\", join(path, \".\"), i)\n\t\t}\n\n\t\tparent = node.Parent\n\t\ti++\n\t}\n\n\tfmt.Fprintf(w,\n\t\tsScanRows,\n\t\tinflections.Pluralize(tree.Type),\n\t\ttree.Type,\n\t\ttree.Type,\n\t\tbuf1.String(),\n\t\tbuf2.String(),\n\t\ttree.Type,\n\t\tbuf3.String(),\n\t)\n}\n\nfunc writeSelectRow(w io.Writer, tree *parse.Node) {\n\tfmt.Fprintf(w, sSelectRow, tree.Type, tree.Type, tree.Type)\n}\n\nfunc writeSelectRows(w io.Writer, tree *parse.Node) {\n\tplural := inflections.Pluralize(tree.Type)\n\tfmt.Fprintf(w, sSelectRows, plural, tree.Type, plural)\n}\n\nfunc writeInsertFunc(w io.Writer, tree *parse.Node) {\n\t// TODO this assumes I'm using the ID field.\n\t// we should not make that assumption\n\tfmt.Fprintf(w, sInsert, tree.Type, tree.Type, 
tree.Type)\n}\n\nfunc writeUpdateFunc(w io.Writer, tree *parse.Node) {\n\tfmt.Fprintf(w, sUpdate, tree.Type, tree.Type, tree.Type)\n}\n\n// join is a helper function that joins nodes\n// together by name using the separator.\nfunc join(nodes []*parse.Node, sep string) string {\n\tvar parts []string\n\tfor _, node := range nodes {\n\t\tparts = append(parts, node.Name)\n\t}\n\treturn strings.Join(parts, sep)\n}\n"
  },
  {
    "path": "gen_schema.go",
    "content": "package main\n\nimport (\n\t\"fmt\"\n\t\"io\"\n\t\"strings\"\n\n\t\"bitbucket.org/pkg/inflect\"\n\t\"github.com/drone/sqlgen/schema\"\n)\n\n// writeSchema writes SQL statements to CREATE, INSERT,\n// UPDATE and DELETE values from Table t.\nfunc writeSchema(w io.Writer, d schema.Dialect, t *schema.Table) {\n\n\twriteConst(w,\n\t\td.Table(t),\n\t\t\"create\", inflect.Singularize(t.Name), \"stmt\",\n\t)\n\n\twriteConst(w,\n\t\td.Insert(t),\n\t\t\"insert\", inflect.Singularize(t.Name), \"stmt\",\n\t)\n\n\twriteConst(w,\n\t\td.Select(t, nil),\n\t\t\"select\", inflect.Singularize(t.Name), \"stmt\",\n\t)\n\n\twriteConst(w,\n\t\td.SelectRange(t, nil),\n\t\t\"select\", inflect.Singularize(t.Name), \"range\", \"stmt\",\n\t)\n\n\twriteConst(w,\n\t\td.SelectCount(t, nil),\n\t\t\"select\", inflect.Singularize(t.Name), \"count\", \"stmt\",\n\t)\n\n\tif len(t.Primary) != 0 {\n\t\twriteConst(w,\n\t\t\td.Select(t, t.Primary),\n\t\t\t\"select\", inflect.Singularize(t.Name), \"pkey\", \"stmt\",\n\t\t)\n\n\t\twriteConst(w,\n\t\t\td.Update(t, t.Primary),\n\t\t\t\"update\", inflect.Singularize(t.Name), \"pkey\", \"stmt\",\n\t\t)\n\n\t\twriteConst(w,\n\t\t\td.Delete(t, t.Primary),\n\t\t\t\"delete\", inflect.Singularize(t.Name), \"pkey\", \"stmt\",\n\t\t)\n\t}\n\n\tfor _, ix := range t.Index {\n\n\t\twriteConst(w,\n\t\t\td.Index(t, ix),\n\t\t\t\"create\", ix.Name, \"stmt\",\n\t\t)\n\n\t\twriteConst(w,\n\t\t\td.Select(t, ix.Fields),\n\t\t\t\"select\", ix.Name, \"stmt\",\n\t\t)\n\n\t\tif !ix.Unique {\n\n\t\t\twriteConst(w,\n\t\t\t\td.SelectRange(t, ix.Fields),\n\t\t\t\t\"select\", ix.Name, \"range\", \"stmt\",\n\t\t\t)\n\n\t\t\twriteConst(w,\n\t\t\t\td.SelectCount(t, ix.Fields),\n\t\t\t\t\"select\", ix.Name, \"count\", \"stmt\",\n\t\t\t)\n\n\t\t} else {\n\n\t\t\twriteConst(w,\n\t\t\t\td.Update(t, ix.Fields),\n\t\t\t\t\"update\", ix.Name, \"stmt\",\n\t\t\t)\n\n\t\t\twriteConst(w,\n\t\t\t\td.Delete(t, ix.Fields),\n\t\t\t\t\"delete\", ix.Name, 
\"stmt\",\n\t\t\t)\n\t\t}\n\t}\n}\n\n// WritePackage writes the Go package header to\n// writer w with the given package name.\nfunc writePackage(w io.Writer, name string) {\n\tfmt.Fprintf(w, sPackage, name)\n}\n\n// writeConst is a helper function that writes the\n// body string to a Go const variable.\nfunc writeConst(w io.Writer, body string, label ...string) {\n\t// create a snake case variable name from\n\t// the specified labels. Then convert the\n\t// variable name to a quoted, camel case string.\n\tname := strings.Join(label, \"_\")\n\tname = inflect.Typeify(name)\n\n\t// quote the body using multi-line quotes\n\tbody = fmt.Sprintf(sQuote, body)\n\n\tfmt.Fprintf(w, sConst, name, body)\n}\n"
  },
  {
    "path": "parse/const.go",
    "content": "package parse\n\nconst (\n\tInvalid = iota\n\tBool\n\tInt\n\tInt8\n\tInt16\n\tInt32\n\tInt64\n\tUint\n\tUint8\n\tUint16\n\tUint32\n\tUint64\n\tFloat32\n\tFloat64\n\tComplex64\n\tComplex128\n\tInterface\n\tBytes\n\tMap\n\tPtr\n\tString\n\tSlice\n\tStruct\n)\n\nvar Types = map[string]uint8{\n\t\"bool\":        Bool,\n\t\"int\":         Int,\n\t\"int8\":        Int8,\n\t\"int16\":       Int16,\n\t\"int32\":       Int32,\n\t\"int64\":       Int64,\n\t\"uint\":        Uint,\n\t\"uint8\":       Uint8,\n\t\"uint16\":      Uint16,\n\t\"uint32\":      Uint32,\n\t\"uint64\":      Uint64,\n\t\"float32\":     Float32,\n\t\"float64\":     Float64,\n\t\"complex64\":   Complex64,\n\t\"complex128\":  Complex128,\n\t\"interface{}\": Interface,\n\t\"[]byte\":      Bytes,\n\t\"string\":      String,\n}\n"
  },
  {
    "path": "parse/node.go",
    "content": "package parse\n\ntype Node struct {\n\tPkg  string // source code package.\n\tName string // source code name.\n\tKind uint8  // source code kind.\n\tType string // source code type.\n\tTags *Tag\n\n\tParent *Node\n\tNodes  []*Node\n}\n\nfunc (n *Node) append(node *Node) {\n\tnode.Parent = n\n\tn.Nodes = append(n.Nodes, node)\n}\n\n// Walk traverses the node tree, invoking the callback\n// function for each node that is traversed.\nfunc (n *Node) Walk(fn func(*Node)) {\n\tfor _, node := range n.Nodes {\n\t\tfn(node)\n\t\tnode.Walk(fn)\n\t}\n}\n\n// WalkRev traverses the tree in reverse order, invoking\n// the callback function for each parent node until\n// the root node is reached.\nfunc (n *Node) WalkRev(fn func(*Node)) {\n\tif n.Parent != nil {\n\t\tn.Parent.WalkRev(fn)\n\t}\n\tfn(n) // this was previously inside the if block\n}\n\n// Edges returns a flattened list of all edge\n// nodes in the Tree.\nfunc (n *Node) Edges() []*Node {\n\tvar nodes []*Node\n\tn.Walk(func(node *Node) {\n\t\tif len(node.Nodes) == 0 {\n\t\t\tnodes = append(nodes, node)\n\t\t}\n\t})\n\treturn nodes\n}\n\n// Path returns the absolute path of the node\n// in the Tree.\nfunc (n *Node) Path() []*Node {\n\tvar nodes []*Node\n\tn.WalkRev(func(node *Node) {\n\t\tnodes = append(nodes, node)\n\t})\n\treturn nodes\n}\n"
  },
  {
    "path": "parse/parse.go",
    "content": "package parse\n\nimport (\n\t\"errors\"\n\t\"fmt\"\n\t\"go/ast\"\n\t\"go/parser\"\n\t\"go/token\"\n)\n\nvar (\n\tErrTypeNotFound = errors.New(\"Cannot find type in the source code.\")\n\tErrTypeInvalid  = errors.New(\"Cannot convert type to a SQL type.\")\n)\n\nfunc Parse(path, name string) (*Node, error) {\n\n\tvar fset = token.NewFileSet()\n\tvar file, err = parser.ParseFile(fset, path, nil, parser.ParseComments)\n\tif err != nil {\n\t\treturn nil, err\n\t}\n\n\tfor _, decl := range file.Decls {\n\t\tgen, ok := decl.(*ast.GenDecl)\n\t\tif !ok {\n\t\t\tcontinue\n\t\t}\n\t\tspec, ok := gen.Specs[0].(*ast.TypeSpec)\n\t\tif !ok {\n\t\t\tcontinue\n\t\t}\n\t\tif spec.Name.String() != name {\n\t\t\tcontinue\n\t\t}\n\n\t\tvar node = new(Node)\n\t\tnode.Name = spec.Name.String()\n\t\tnode.Type = spec.Name.String()\n\t\tnode.Pkg = file.Name.Name\n\t\terr = buildNodes(node, spec)\n\t\treturn node, err\n\t}\n\n\treturn nil, ErrTypeNotFound\n}\n\nfunc buildNodes(parent *Node, spec *ast.TypeSpec) error {\n\tident, ok := spec.Type.(*ast.StructType)\n\tif !ok {\n\t\treturn ErrTypeInvalid\n\t}\n\n\tfor _, field := range ident.Fields.List {\n\t\tvar tag string\n\t\tif field.Tag != nil {\n\t\t\ttag = field.Tag.Value\n\t\t}\n\t\tbuildNode(parent, field.Type, field.Names[0].Name, tag)\n\t}\n\treturn nil\n}\n\nfunc buildNode(parent *Node, expr ast.Expr, name, tag string) error {\n\tvar err error\n\n\tswitch ident := expr.(type) {\n\tcase *ast.Ident:\n\t\tif ident.Obj == nil {\n\t\t\tnode := &Node{\n\t\t\t\tName: name,\n\t\t\t\tType: ident.Name,\n\t\t\t\tKind: Types[ident.Name],\n\t\t\t}\n\t\t\tnode.Tags, err = parseTag(tag)\n\t\t\tif err != nil {\n\t\t\t\treturn err\n\t\t\t}\n\t\t\tparent.append(node)\n\t\t\treturn nil\n\t\t}\n\t\tspec, ok := ident.Obj.Decl.(*ast.TypeSpec)\n\t\tif !ok {\n\t\t\tgoto invalidType\n\t\t}\n\t\tnode := &Node{\n\t\t\tName: name,\n\t\t\tType: ident.Name,\n\t\t\tKind: Struct,\n\t\t}\n\t\tnode.Tags, err = parseTag(tag)\n\t\tif err != nil 
{\n\t\t\treturn err\n\t\t}\n\t\tparent.append(node)\n\t\treturn buildNodes(node, spec)\n\n\tcase *ast.ArrayType:\n\t\tif ident.Len != nil {\n\t\t\tgoto invalidType\n\t\t}\n\t\tnode := &Node{\n\t\t\tName: name,\n\t\t\tKind: Slice,\n\t\t\tType: fmt.Sprintf(\"[]%s\", ident.Elt),\n\t\t}\n\t\tnode.Tags, err = parseTag(tag)\n\t\tif err != nil {\n\t\t\treturn err\n\t\t}\n\t\tif node.Type == \"[]byte\" {\n\t\t\tnode.Kind = Bytes\n\t\t}\n\t\tparent.append(node)\n\t\treturn nil\n\n\tcase *ast.MapType:\n\t\ttype_ := fmt.Sprintf(\"map[%s]%s\", ident.Key, ident.Value)\n\t\tnode := &Node{Name: name, Type: type_, Kind: Map}\n\t\tnode.Tags, err = parseTag(tag)\n\t\tif err != nil {\n\t\t\treturn err\n\t\t}\n\t\tparent.append(node)\n\t\treturn nil\n\n\tcase *ast.StarExpr:\n\t\tinnerIdent, ok := ident.X.(*ast.Ident)\n\t\tif !ok {\n\t\t\tgoto invalidType\n\t\t}\n\t\tif innerIdent.Obj == nil || innerIdent.Obj.Decl == nil {\n\t\t\tgoto invalidType\n\t\t}\n\t\tspec, ok := innerIdent.Obj.Decl.(*ast.TypeSpec)\n\t\tif !ok {\n\t\t\tgoto invalidType\n\t\t}\n\t\tnode := &Node{Name: name, Type: innerIdent.Name, Kind: Ptr}\n\t\tnode.Tags, err = parseTag(tag)\n\t\tif err != nil {\n\t\t\treturn err\n\t\t}\n\t\tif node.Tags.Skip {\n\t\t\treturn nil\n\t\t}\n\t\tparent.append(node)\n\t\treturn buildNodes(node, spec)\n\t}\n\ninvalidType:\n\treturn fmt.Errorf(\"%s is not a valid type\", name)\n}\n"
  },
  {
    "path": "parse/tag.go",
    "content": "package parse\n\nimport (\n\t\"fmt\"\n\t\"reflect\"\n\t\"strings\"\n\n\t\"gopkg.in/yaml.v2\"\n)\n\nconst (\n\tEncodeGzip = \"gzip\"\n\tEncodeJson = \"json\"\n)\n\n// Tag stores the parsed data from the tag string in\n// a struct field.\ntype Tag struct {\n\tName    string `yaml:\"name\"`\n\tType    string `yaml:\"type\"`\n\tPrimary bool   `yaml:\"pk\"`\n\tAuto    bool   `yaml:\"auto\"`\n\tIndex   string `yaml:\"index\"`\n\tUnique  string `yaml:\"unique\"`\n\tSize    int    `yaml:\"size\"`\n\tSkip    bool   `yaml:\"skip\"`\n\tEncode  string `yaml:\"encode\"`\n}\n\n// parseTag parses a tag string from the struct\n// field and unmarshals into a Tag struct.\nfunc parseTag(raw string) (*Tag, error) {\n\tvar tag = new(Tag)\n\n\traw = strings.Replace(raw, \"`\", \"\", -1)\n\traw = reflect.StructTag(raw).Get(\"sql\")\n\n\t// if the tag indicates the field should\n\t// be skipped we can exit right away.\n\tif strings.TrimSpace(raw) == \"-\" {\n\t\ttag.Skip = true\n\t\treturn tag, nil\n\t}\n\n\t// otherwise wrap the string in curly braces\n\t// so that we can use the Yaml parser.\n\traw = fmt.Sprintf(\"{ %s }\", raw)\n\n\t// unmarshals the Yaml formatted string into\n\t// the Tag structure.\n\tvar err = yaml.Unmarshal([]byte(raw), tag)\n\treturn tag, err\n}\n"
  },
  {
    "path": "parse/tag_test.go",
    "content": "package parse\n\nimport (\n\t\"reflect\"\n\t\"testing\"\n)\n\nvar tagTests = []struct {\n\traw string\n\ttag *Tag\n}{\n\t{\n\t\t`sql:\"-\"`,\n\t\t&Tag{Skip: true},\n\t},\n\t{\n\t\t`sql:\"pk: true, auto: true\"`,\n\t\t&Tag{Primary: true, Auto: true},\n\t},\n\t{\n\t\t`sql:\"name: foo\"`,\n\t\t&Tag{Name: \"foo\"},\n\t},\n\t{\n\t\t`sql:\"type: varchar\"`,\n\t\t&Tag{Type: \"varchar\"},\n\t},\n\t{\n\t\t`sql:\"size: 2048\"`,\n\t\t&Tag{Size: 2048},\n\t},\n\t{\n\t\t`sql:\"index: fake_index\"`,\n\t\t&Tag{Index: \"fake_index\"},\n\t},\n\t{\n\t\t`sql:\"unique: fake_unique_index\"`,\n\t\t&Tag{Unique: \"fake_unique_index\"},\n\t},\n}\n\nfunc TestParseTag(t *testing.T) {\n\tfor _, test := range tagTests {\n\n\t\tvar want = test.tag\n\t\tvar got, err = parseTag(test.raw)\n\n\t\tif err != nil {\n\t\t\tt.Errorf(\"Got Error parsing Tag %s. %s\", test.raw, err)\n\t\t}\n\n\t\tif !reflect.DeepEqual(got, want) {\n\t\t\tt.Errorf(\"Wanted Tag %+v, got Tag %+v\", want, got)\n\t\t}\n\t}\n}\n"
  },
  {
    "path": "schema/base.go",
    "content": "package schema\n\nimport (\n\t\"bytes\"\n\t\"fmt\"\n\t\"io\"\n\t\"strings\"\n\t\"text/tabwriter\"\n)\n\ntype base struct {\n\tDialect Dialect\n}\n\n// Table returns a SQL statement to create the table.\nfunc (b *base) Table(t *Table) string {\n\n\t// use a large default buffer size of so that\n\t// the tabbing doesn't get prematurely flushed\n\t// resulting in un-even lines.\n\tvar byt = make([]byte, 0, 100000)\n\tvar buf = bytes.NewBuffer(byt)\n\n\t// use a tab writer to evenly space the column\n\t// names and column types.\n\tvar tab = tabwriter.NewWriter(buf, 0, 8, 1, ' ', 0)\n\tb.columnw(tab, t.Fields, false, false, true)\n\n\t// flush the tab writer to write to the buffer\n\ttab.Flush()\n\n\treturn fmt.Sprintf(\"CREATE TABLE IF NOT EXISTS %s (%s\\n);\", t.Name, buf.String())\n}\n\n// Index returns a SQL statement to create the index.\nfunc (b *base) Index(table *Table, index *Index) string {\n\tvar obj = \"INDEX\"\n\tif index.Unique {\n\t\tobj = \"UNIQUE INDEX\"\n\t}\n\treturn fmt.Sprintf(\"CREATE %s IF NOT EXISTS %s ON %s (%s)\", obj, index.Name, table.Name, b.columns(index.Fields, true, false, false))\n}\n\nfunc (b *base) Insert(t *Table) string {\n\tvar fields []*Field\n\tvar params []string\n\tvar i int\n\n\tfor _, field := range t.Fields {\n\t\tif !field.Auto {\n\t\t\tfields = append(fields, field)\n\t\t\tparams = append(params, b.Dialect.Param(i))\n\t\t\ti++\n\t\t}\n\t}\n\n\treturn fmt.Sprintf(\"INSERT INTO %s (%s\\n) VALUES (%s)\", t.Name, b.columns(fields, false, false, false), strings.Join(params, \",\"))\n}\n\nfunc (b *base) Update(t *Table, fields []*Field) string {\n\treturn fmt.Sprintf(\"UPDATE %s SET %s %s\", t.Name, b.columns(t.Fields, false, true, false), b.clause(fields, len(t.Fields)))\n}\n\nfunc (b *base) Delete(t *Table, fields []*Field) string {\n\treturn fmt.Sprintf(\"DELETE FROM %s %s\", t.Name, b.clause(fields, 0))\n}\n\nfunc (b *base) Select(t *Table, fields []*Field) string {\n\treturn fmt.Sprintf(\"SELECT %s\\nFROM %s 
%s\", b.columns(t.Fields, false, false, false), t.Name, b.clause(fields, 0))\n}\n\nfunc (b *base) SelectRange(t *Table, fields []*Field) string {\n\treturn fmt.Sprintf(\"SELECT %s\\nFROM %s %s\\nLIMIT %s OFFSET %s\", b.columns(t.Fields, false, false, false), t.Name, b.clause(fields, 0), b.Dialect.Param(len(fields)), b.Dialect.Param(len(fields)+1))\n}\n\nfunc (b *base) SelectCount(t *Table, fields []*Field) string {\n\treturn fmt.Sprintf(\"SELECT count(1)\\nFROM %s %s\", t.Name, b.clause(fields, 0))\n}\n\n// Param returns the parameters symbol used in prepared\n// sql statements.\nfunc (b *base) Param(i int) string {\n\treturn \"?\"\n}\n\n// Column returns a SQL type for the given field.\n//\n// For Mysql and Postgres see:\n// https://github.com/eaigner/hood/blob/master/mysql.go#L35\nfunc (b *base) Column(f *Field) string {\n\tswitch f.Type {\n\tcase INTEGER:\n\t\treturn \"INTEGER\"\n\tcase BOOLEAN:\n\t\treturn \"BOOLEAN\"\n\tcase BLOB:\n\t\treturn \"BLOB\"\n\tcase VARCHAR:\n\t\treturn \"TEXT\"\n\tdefault:\n\t\treturn \"TEXT\"\n\t}\n}\n\n// Token returns the SQL string for the requested token.\nfunc (b *base) Token(v int) (_ string) {\n\tswitch v {\n\tcase AUTO_INCREMENT:\n\t\treturn \"AUTOINCREMENT\"\n\tcase PRIMARY_KEY:\n\t\treturn \"PRIMARY KEY\"\n\tdefault:\n\t\treturn\n\t}\n}\n\n// helper function to generate a block of columns. 
You\n// can optionally generate in inline list of columns,\n// include an assignment operator, and include column\n// definitions.\nfunc (b *base) columns(fields []*Field, inline, assign, def bool) string {\n\tvar buf bytes.Buffer\n\tb.columnw(&buf, fields, inline, assign, def)\n\treturn buf.String()\n}\n\n// helper function to write a block of columns to w.\nfunc (b *base) columnw(w io.Writer, fields []*Field, inline, assign, def bool) {\n\n\tfor i, field := range fields {\n\t\tif !inline {\n\t\t\tio.WriteString(w, \"\\n\")\n\t\t}\n\n\t\tswitch {\n\t\tcase i == 0 && !inline:\n\t\t\tio.WriteString(w, \" \")\n\t\tcase i != 0:\n\t\t\tio.WriteString(w, \",\")\n\t\t}\n\t\tio.WriteString(w, field.Name)\n\n\t\tif assign {\n\t\t\tio.WriteString(w, \"=\")\n\t\t\tio.WriteString(w, b.Dialect.Param(i))\n\t\t}\n\n\t\tif !def {\n\t\t\tcontinue\n\t\t}\n\n\t\tio.WriteString(w, \"\\t\")\n\t\tio.WriteString(w, b.Dialect.Column(field))\n\n\t\tif field.Primary {\n\t\t\tio.WriteString(w, \" \")\n\t\t\tio.WriteString(w, b.Dialect.Token(PRIMARY_KEY))\n\t\t}\n\n\t\tif field.Auto {\n\t\t\tio.WriteString(w, \" \")\n\t\t\tio.WriteString(w, b.Dialect.Token(AUTO_INCREMENT))\n\t\t}\n\t}\n}\n\n// helper function to generate the Where clause\n// section of a SQL statement\nfunc (b *base) clause(fields []*Field, pos int) string {\n\tvar buf bytes.Buffer\n\n\tvar i int\n\tfor _, field := range fields {\n\t\tbuf.WriteString(\"\\n\")\n\t\tswitch {\n\t\tcase i == 0:\n\t\t\tbuf.WriteString(\"WHERE\")\n\t\tdefault:\n\t\t\tbuf.WriteString(\"AND\")\n\t\t}\n\n\t\tbuf.WriteString(\" \")\n\t\tbuf.WriteString(field.Name)\n\t\tbuf.WriteString(\"=\")\n\t\tbuf.WriteString(b.Dialect.Param(i + pos))\n\n\t\ti++\n\t}\n\treturn buf.String()\n}\n"
  },
  {
    "path": "schema/dialect.go",
    "content": "package schema\n\nconst (\n\tSQLITE int = iota\n\tPOSTGRES\n\tMYSQL\n)\n\nvar Dialects = map[string]int{\n\t\"sqlite\":   SQLITE,\n\t\"postgres\": POSTGRES,\n\t\"mysql\":    MYSQL,\n}\n\ntype Dialect interface {\n\tTable(*Table) string\n\tIndex(*Table, *Index) string\n\tColumn(*Field) string\n\tInsert(*Table) string\n\tUpdate(*Table, []*Field) string\n\tDelete(*Table, []*Field) string\n\tSelect(*Table, []*Field) string\n\tSelectCount(*Table, []*Field) string\n\tSelectRange(*Table, []*Field) string\n\tParam(int) string\n\tToken(int) string\n}\n\nfunc New(dialect int) Dialect {\n\tswitch dialect {\n\tcase POSTGRES:\n\t\treturn newPosgres()\n\tcase MYSQL:\n\t\treturn newMysql()\n\tdefault:\n\t\treturn newSqlite()\n\t}\n}\n"
  },
  {
    "path": "schema/dialect_mysql.go",
    "content": "package schema\n\nimport (\n\t\"fmt\"\n)\n\ntype mysql struct {\n\tbase\n}\n\nfunc newMysql() Dialect {\n\td := &mysql{}\n\td.base.Dialect = d\n\treturn d\n}\n\nfunc (d *mysql) Column(f *Field) (_ string) {\n\tswitch f.Type {\n\tcase INTEGER:\n\t\treturn \"INTEGER\"\n\tcase BOOLEAN:\n\t\treturn \"BOOLEAN\"\n\tcase BLOB:\n\t\treturn \"MEDIUMBLOB\"\n\tcase VARCHAR:\n\t\t// assigns an arbitrary size if\n\t\t// none is provided.\n\t\tsize := f.Size\n\t\tif size == 0 {\n\t\t\tsize = 512\n\t\t}\n\t\treturn fmt.Sprintf(\"VARCHAR(%d)\", size)\n\tdefault:\n\t\treturn\n\t}\n}\n\nfunc (d *mysql) Token(v int) (_ string) {\n\tswitch v {\n\tcase AUTO_INCREMENT:\n\t\treturn \"AUTO_INCREMENT\"\n\tcase PRIMARY_KEY:\n\t\treturn \"PRIMARY KEY\"\n\tdefault:\n\t\treturn\n\t}\n}\n"
  },
  {
    "path": "schema/dialect_postgres.go",
    "content": "package schema\n\nimport (\n\t\"fmt\"\n)\n\ntype posgres struct {\n\tbase\n}\n\nfunc newPosgres() Dialect {\n\td := &posgres{}\n\td.base.Dialect = d\n\treturn d\n}\n\nfunc (d *posgres) Column(f *Field) (_ string) {\n\t// posgres uses a special column type\n\t// to autoincrementing keys.\n\tif f.Auto {\n\t\treturn \"SERIAL\"\n\t}\n\n\tswitch f.Type {\n\tcase INTEGER:\n\t\treturn \"INTEGER\"\n\tcase BOOLEAN:\n\t\treturn \"BOOLEAN\"\n\tcase BLOB:\n\t\treturn \"BYTEA\"\n\tcase VARCHAR:\n\t\t// assigns an arbitrary size if\n\t\t// none is provided.\n\t\tsize := f.Size\n\t\tif size == 0 {\n\t\t\tsize = 512\n\t\t}\n\t\treturn fmt.Sprintf(\"VARCHAR(%d)\", size)\n\tdefault:\n\t\treturn\n\t}\n}\n\nfunc (d *posgres) Token(v int) (_ string) {\n\tswitch v {\n\tcase AUTO_INCREMENT:\n\t\t// postgres does not support the\n\t\t// auto-increment keyword.\n\t\treturn\n\tcase PRIMARY_KEY:\n\t\treturn \"PRIMARY KEY\"\n\tdefault:\n\t\treturn\n\t}\n}\n\nfunc (d *posgres) Param(i int) string {\n\treturn fmt.Sprintf(\"$%d\", i+1)\n}\n"
  },
  {
    "path": "schema/dialect_sqlite.go",
    "content": "package schema\n\ntype sqlite struct {\n\tbase\n}\n\nfunc newSqlite() Dialect {\n\td := &sqlite{}\n\td.base.Dialect = d\n\treturn d\n}\n"
  },
  {
    "path": "schema/helper.go",
    "content": "package schema\n\nimport (\n\t\"strings\"\n\n\t\"github.com/acsellers/inflections\"\n\t\"github.com/drone/sqlgen/parse\"\n)\n\nfunc Load(tree *parse.Node) *Table {\n\ttable := new(Table)\n\n\t// local map of indexes, used for quick\n\t// lookups and de-duping.\n\tindexs := map[string]*Index{}\n\n\t// pluralizes the table name and then\n\t// formats in snake case.\n\ttable.Name = inflections.Underscore(tree.Type)\n\ttable.Name = inflections.Pluralize(table.Name)\n\n\t// each edge node in the tree is a column\n\t// in the table. Convert each edge node to\n\t// a Field structure.\n\tfor _, node := range tree.Edges() {\n\n\t\tfield := new(Field)\n\n\t\t// Lookup the SQL column type\n\t\t// TODO: move this to a function\n\t\tt, ok := parse.Types[node.Type]\n\t\tif ok {\n\t\t\ttt, ok := types[t]\n\t\t\tif !ok {\n\t\t\t\ttt = BLOB\n\t\t\t}\n\t\t\tfield.Type = tt\n\t\t} else {\n\t\t\tfield.Type = BLOB\n\t\t}\n\n\t\t// substitute tag variables\n\t\tif node.Tags != nil {\n\n\t\t\tif node.Tags.Skip {\n\t\t\t\tcontinue\n\t\t\t}\n\n\t\t\t// default ID and int64 to primary key\n\t\t\t// with auto-increment\n\t\t\tif node.Name == \"ID\" && node.Kind == parse.Int64 {\n\t\t\t\tnode.Tags.Primary = true\n\t\t\t\tnode.Tags.Auto = true\n\t\t\t}\n\n\t\t\tfield.Auto = node.Tags.Auto\n\t\t\tfield.Primary = node.Tags.Primary\n\t\t\tfield.Size = node.Tags.Size\n\n\t\t\tif node.Tags.Primary {\n\t\t\t\ttable.Primary = append(table.Primary, field)\n\t\t\t}\n\n\t\t\tif node.Tags.Index != \"\" {\n\t\t\t\tindex, ok := indexs[node.Tags.Index]\n\t\t\t\tif !ok {\n\t\t\t\t\tindex = new(Index)\n\t\t\t\t\tindex.Name = node.Tags.Index\n\t\t\t\t\tindexs[index.Name] = index\n\t\t\t\t\ttable.Index = append(table.Index, index)\n\t\t\t\t}\n\t\t\t\tindex.Fields = append(index.Fields, field)\n\t\t\t}\n\n\t\t\tif node.Tags.Unique != \"\" {\n\t\t\t\tindex, ok := indexs[node.Tags.Index]\n\t\t\t\tif !ok {\n\t\t\t\t\tindex = new(Index)\n\t\t\t\t\tindex.Name = node.Tags.Unique\n\t\t\t\t\tindex.Unique 
= true\n\t\t\t\t\tindexs[index.Name] = index\n\t\t\t\t\ttable.Index = append(table.Index, index)\n\t\t\t\t}\n\t\t\t\tindex.Fields = append(index.Fields, field)\n\t\t\t}\n\n\t\t\tif node.Tags.Type != \"\" {\n\t\t\t\tt, ok := sqlTypes[node.Tags.Type]\n\t\t\t\tif ok {\n\t\t\t\t\tfield.Type = t\n\t\t\t\t}\n\t\t\t}\n\t\t}\n\n\t\t// get the full path name\n\t\tpath := node.Path()\n\t\tvar parts []string\n\t\tfor _, part := range path {\n\t\t\tif part.Tags != nil && part.Tags.Name != \"\" {\n\t\t\t\tparts = append(parts, part.Tags.Name)\n\t\t\t\tcontinue\n\t\t\t}\n\n\t\t\tparts = append(parts, part.Name)\n\t\t}\n\t\tfield.Name = strings.Join(parts, \"_\")\n\t\tfield.Name = inflections.Underscore(field.Name)\n\n\t\ttable.Fields = append(table.Fields, field)\n\t}\n\n\treturn table\n}\n\n// convert Go types to SQL types.\nvar types = map[uint8]int{\n\tparse.Bool:       BOOLEAN,\n\tparse.Int:        INTEGER,\n\tparse.Int8:       INTEGER,\n\tparse.Int16:      INTEGER,\n\tparse.Int32:      INTEGER,\n\tparse.Int64:      INTEGER,\n\tparse.Uint:       INTEGER,\n\tparse.Uint8:      INTEGER,\n\tparse.Uint16:     INTEGER,\n\tparse.Uint32:     INTEGER,\n\tparse.Uint64:     INTEGER,\n\tparse.Float32:    INTEGER,\n\tparse.Float64:    INTEGER,\n\tparse.Complex64:  INTEGER,\n\tparse.Complex128: INTEGER,\n\tparse.Interface:  BLOB,\n\tparse.Bytes:      BLOB,\n\tparse.String:     VARCHAR,\n\tparse.Map:        BLOB,\n\tparse.Slice:      BLOB,\n}\n\nvar sqlTypes = map[string]int{\n\t\"text\":     VARCHAR,\n\t\"varchar\":  VARCHAR,\n\t\"varchar2\": VARCHAR,\n\t\"number\":   INTEGER,\n\t\"integer\":  INTEGER,\n\t\"int\":      INTEGER,\n\t\"blob\":     BLOB,\n\t\"bytea\":    BLOB,\n}\n"
  },
  {
    "path": "schema/schema.go",
    "content": "package schema\n\n// List of basic types\nconst (\n\tINTEGER int = iota\n\tVARCHAR\n\tBOOLEAN\n\tREAL\n\tBLOB\n)\n\n// List of vendor-specific keywords\nconst (\n\tAUTO_INCREMENT = iota\n\tPRIMARY_KEY\n)\n\ntype Table struct {\n\tName string\n\n\tFields  []*Field\n\tIndex   []*Index\n\tPrimary []*Field\n}\n\ntype Field struct {\n\tName    string\n\tType    int\n\tPrimary bool\n\tAuto    bool\n\tSize    int\n}\n\ntype Index struct {\n\tName   string\n\tUnique bool\n\n\tFields []*Field\n}\n"
  },
  {
    "path": "tmpl.go",
    "content": "package main\n\n// template to create a constant variable.\nvar sConst = `\nconst %s = %s\n`\n\n// template to wrap a string in multi-line quotes.\nvar sQuote = \"`\\n%s\\n`\"\n\n// template to declare the package name.\nvar sPackage = `\npackage %s\n\n// THIS FILE WAS AUTO-GENERATED. DO NOT MODIFY.\n`\n\n// template to delcare the package imports.\nvar sImport = `\nimport (\n\t%s\n)\n`\n\n// function template to scan a single row.\nconst sScanRow = `\nfunc Scan%s(row *sql.Row) (*%s, error) {\n\t%s\n\n\terr := row.Scan(\n\t\t%s\n\t)\n\tif err != nil {\n\t\treturn nil, err\n\t}\n\n\tv := &%s{}\n\t%s\n\n\treturn v, nil\n}\n`\n\n// function template to scan multiple rows.\nconst sScanRows = `\nfunc Scan%s(rows *sql.Rows) ([]*%s, error) {\n\tvar err error\n\tvar vv []*%s\n\n\t%s\n\tfor rows.Next() {\n\t\terr = rows.Scan(\n\t\t\t%s\n\t\t)\n\t\tif err != nil {\n\t\t\treturn vv, err\n\t\t}\n\n\t\tv := &%s{}\n\t\t%s\n\t\tvv = append(vv, v)\n\t}\n\treturn vv, rows.Err()\n}\n`\n\nconst sSliceRow = `\nfunc Slice%s(v *%s) []interface{} {\n\t%s\n\t%s\n\n\treturn []interface{}{\n\t\t%s\n\t}\n}\n`\n\nconst sSelectRow = `\nfunc Select%s(db *sql.DB, query string, args ...interface{}) (*%s, error) {\n\trow := db.QueryRow(query, args...)\n\treturn Scan%s(row)\n}\n`\n\n// function template to select multiple rows.\nconst sSelectRows = `\nfunc Select%s(db *sql.DB, query string, args ...interface{}) ([]*%s, error) {\n\trows, err := db.Query(query, args...)\n\tif err != nil {\n\t\treturn nil, err\n\t}\n\tdefer rows.Close()\n\treturn Scan%s(rows)\n}\n`\n\n// function template to insert a single row.\nconst sInsert = `\nfunc Insert%s(db *sql.DB, query string, v *%s) error {\n\n\tres, err := db.Exec(query, Slice%s(v)[1:]...)\n\tif err != nil {\n\t\treturn err\n\t}\n\n\tv.ID, err = res.LastInsertId()\n\treturn err\n}\n`\n\n// function template to update a single row.\nconst sUpdate = `\nfunc Update%s(db *sql.DB, query string, v *%s) error {\n\n\targs := Slice%s(v)[1:]\n\targs 
= append(args, v.ID)\n\t_, err := db.Exec(query, args...)\n\treturn err\n}\n`\n"
  },
  {
    "path": "util.go",
    "content": "package main\n"
  }
]