Repository: minimaxir/download-tweets-ai-text-gen
Branch: master
Commit: 673465d1acd6
Files: 13
Total size: 768.6 KB
Directory structure:
gitextract_pv4k9zy1/
├── DNC_usernames.txt
├── GOP_usernames.txt
├── LICENSE
├── README.md
├── download_tweets.py
├── examples/
│ ├── JanelleCShane_355M.txt
│ ├── JanelleCShane_355M_2.txt
│ ├── MagicRealismBot_355M.txt
│ ├── chrissyteigen_355M.txt
│ ├── elonmusk_355M.txt
│ └── minimaxir_355M.txt
├── github/
│ └── FUNDING.yml
└── requirements.txt
================================================
FILE CONTENTS
================================================
================================================
FILE: DNC_usernames.txt
================================================
SeemaNanda
SpeakerPelosi
SenSchumer
RashidaTlaib
AyannaPressley
brianschatz
AOC
IlhanMN
joncoopertweets
TheDemCoalition
funder
DavidCornDC
DNCWarRoom
tribelaw
NatashaBertrand
RepSwalwell
TomPerez
RonWyden
RBReigh
SenatorDurbin
================================================
FILE: GOP_usernames.txt
================================================
realDonaldTrump
IvankaTrump
DonaldJTrumpJr
JudgeJeanine
parscale
GOPLeader
senatemajldr
AnnCoulter
SenTedCruz
GovMikeHuckabee
IvankaTrump
tedcruz
newtgingrich
erictrump
LindseyGrahamSC
BillOreilly
DevinNunes
IngrahamAngle
seanhannity
================================================
FILE: LICENSE
================================================
MIT License
Copyright (c) 2019-2020 Max Woolf
Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:
The above copyright notice and this permission notice shall be included in all
copies or substantial portions of the Software.
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
SOFTWARE.
================================================
FILE: README.md
================================================
# download-tweets-ai-text-gen
A small Python 3 script to download public Tweets from a given Twitter account into a format suitable for AI text generation tools (such as [gpt-2-simple](https://github.com/minimaxir/gpt-2-simple) for finetuning [GPT-2](https://openai.com/blog/better-language-models/)).
* Retrieves all tweets as a simple CSV with a single CLI command.
* Preprocesses tweets to remove URLs, extra spaces, and optionally usertags/hashtags.
* Saves tweets in batches, in case there is an error or you want to end the collection early.
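The preprocessing boils down to a single regular expression; a minimal sketch of the cleaning logic (the pattern matches the one in `download_tweets.py`; the sample tweet text is made up):

```python
import re

# Base pattern strips URLs, pic.twitter.com links, non-breaking spaces, and ellipses
pattern = r"http\S+|pic\.\S+|\xa0|…"
pattern += r"|@[a-zA-Z0-9_]+"  # added when strip_usertags=True
pattern += r"|#[a-zA-Z0-9_]+"  # added when strip_hashtags=True

raw = "Great thread @someuser https://t.co/abc123 #AI…"
clean = re.sub(pattern, "", raw).strip()
print(clean)  # "Great thread"
```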
You can view examples of AI-generated tweets from datasets retrieved with this tool in the `/examples` folder.
Inspired by popular demand due to the success of [@dril_gpt2](https://twitter.com/dril_gpt2).
## Usage
First, install the Python script dependencies:
```sh
pip3 install twint==2.1.4 fire tqdm
```
Then download the `download_tweets.py` script from this repo.
The script is operated via a command-line interface. After `cd`ing into the directory where the script is stored, run:
```sh
python3 download_tweets.py <twitter_username>
```
For example, to download all tweets (sans retweets/replies/quote tweets) from the Twitter user [@dril](https://twitter.com/dril), run:
```sh
python3 download_tweets.py dril
```
The script can also download tweets from multiple usernames at one time. To do so, first create a text file (`.txt`) with a list of the usernames. Then run the script, referencing the file name:
```sh
python3 download_tweets.py <twitter_usernames_file_name>
```
The tweets will be downloaded to a single-column CSV titled `<usernames>_tweets.csv`.
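The output format can be illustrated with a small sketch (the filename and rows here are made up; what matters is the single `tweets` header column):

```python
import csv

# Write a CSV in the same single-column shape the script produces
with open("example_tweets.csv", "w", encoding="utf8", newline="") as f:
    w = csv.writer(f)
    w.writerow(["tweets"])        # header row expected by gpt-2-simple
    w.writerow(["first tweet"])
    w.writerow(["second tweet"])

with open("example_tweets.csv", encoding="utf8", newline="") as f:
    rows = list(csv.reader(f))

print(rows)  # [['tweets'], ['first tweet'], ['second tweet']]
```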
The parameters you can pass to the command line interface (positionally or explicitly) are:
* username: Username of the account whose tweets you want to download, or the name of a .txt file containing multiple usernames [required]
* limit: Number of tweets to download [default: all tweets possible]
* include_replies: Include replies from the user in the dataset [default: False]
* include_links: Include tweets containing links [default: False]
* strip_usertags: Strips out `@` user tags from the tweet text [default: False]
* strip_hashtags: Strips out `#` hashtags from the tweet text [default: False]
## How to Train an AI on the downloaded tweets
[gpt-2-simple](https://github.com/minimaxir/gpt-2-simple) has a special case for single-column CSVs, where it will automatically process the text for best training and generation. (i.e. by adding `<|startoftext|>` and `<|endoftext|>` to each tweet, allowing independent generation of tweets)
You can use [this Colaboratory notebook](https://colab.research.google.com/drive/1qxcQ2A1nNjFudAGN_mcMOnvV9sF_PkEb) (optimized from the original notebook for this use case) to train the model on your downloaded tweets, and generate massive amounts of Tweets from it. Note that without a lot of data, the model might easily overfit; you may want to train for fewer `steps` (e.g. `500`).
When generating, you'll always need to include certain parameters to decode the tweets, e.g.:
```python
gpt2.generate(sess,
              length=200,
              temperature=0.7,
              prefix='<|startoftext|>',
              truncate='<|endoftext|>',
              include_prefix=False
              )
```
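The role of `prefix`/`truncate`/`include_prefix` can be mimicked with plain string handling (a sketch of the decoding idea, not gpt-2-simple's internals; the raw output string is made up):

```python
# Simulated raw model output containing the tweet delimiter tokens
raw_output = "<|startoftext|>a generated tweet<|endoftext|>trailing junk"

# include_prefix=False drops the start token; truncate cuts at the end token
text = raw_output[len("<|startoftext|>"):]
tweet = text.split("<|endoftext|>")[0]
print(tweet)  # "a generated tweet"
```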
## Helpful Notes
* Retweets are not included in the downloaded dataset (which is generally a good thing).
* You'll need *thousands* of tweets at minimum as input to the model for good generation results (ideally 1 MB of input text data, although with tweets that's hard to achieve).
* To help you reach 1 MB of input text data, you can load data from multiple similar Twitter usernames.
* The download will likely end much earlier than the theoretical limit (inferred from the user profile), as that limit includes retweets/replies/whatever cache shenanigans Twitter is employing.
* The legality of distributing downloaded tweets is ambiguous; it's therefore recommended to avoid committing raw Twitter data to GitHub, which is why examples of such data are not included in this repo. (AI-generated tweets themselves likely fall under derivative work/parody protected by Fair Use.)
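To gauge whether a downloaded dataset approaches the suggested 1 MB, a quick size check works (a sketch; the filename and contents here are made up to simulate a downloaded CSV):

```python
import os

# Simulate a downloaded CSV and measure its size in megabytes
with open("example_tweets.csv", "w", encoding="utf8") as f:
    f.write("tweets\n")
    f.write("an example tweet of average length for the dataset\n" * 2000)

size_mb = os.path.getsize("example_tweets.csv") / 1e6
print("{:.2f} MB".format(size_mb))
```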
## Maintainer/Creator
Max Woolf ([@minimaxir](https://minimaxir.com))
*Max's open-source projects are supported by his [Patreon](https://www.patreon.com/minimaxir) and [GitHub Sponsors](https://github.com/sponsors/minimaxir). If you found this project helpful, any monetary contributions to the Patreon are appreciated and will be put to good creative use.*
## License
MIT
## Disclaimer
This repo has no affiliation with Twitter Inc.
================================================
FILE: download_tweets.py
================================================
import twint
import fire
import re
import csv
from tqdm import tqdm
import logging
from datetime import datetime
from time import sleep
import os

# Suppress random twint warnings
logger = logging.getLogger()
logger.disabled = True
def is_reply(tweet):
    """
    Determines if the tweet is a reply to another tweet.

    Requires somewhat hacky heuristics since this is not included w/ twint.
    """

    # If not a reply to another user, there will only be 1 entry in reply_to
    if len(tweet.reply_to) == 1:
        return False

    # Check to see if any of the other users "replied" are in the tweet text
    users = tweet.reply_to[1:]
    conversations = [user["username"] in tweet.tweet for user in users]

    # If any of the usernames are not present in the text, then it must be a reply
    if sum(conversations) < len(users):
        return True
    return False
def download_tweets(
    username=None,
    limit=None,
    include_replies=False,
    include_links=False,
    strip_usertags=False,
    strip_hashtags=False,
):
    """Download public Tweets from a given Twitter account
    into a format suitable for training with AI text generation tools.

    :param username: Twitter @ username to gather tweets.
    :param limit: # of tweets to gather; None for all tweets.
    :param include_replies: Whether to include replies to other tweets.
    :param include_links: Whether to include tweets with links.
    :param strip_usertags: Whether to remove user tags from the tweets.
    :param strip_hashtags: Whether to remove hashtags from the tweets.
    :return tweets: List of tweets from the Twitter account
    """

    # If a limit is specified, validate that it is a multiple of 20
    if limit:
        assert limit % 20 == 0, "`limit` must be a multiple of 20."

    # If no limit is specified, estimate the total number of tweets from the profile.
    else:
        c_lookup = twint.Config()
        c_lookup.Username = username
        c_lookup.Store_object = True
        c_lookup.Hide_output = True
        c_lookup.Links = "include" if include_links else "exclude"

        twint.run.Lookup(c_lookup)
        limit = twint.output.users_list[-1].tweets

    pattern = r"http\S+|pic\.\S+|\xa0|…"

    if strip_usertags:
        pattern += r"|@[a-zA-Z0-9_]+"

    if strip_hashtags:
        pattern += r"|#[a-zA-Z0-9_]+"

    # Create an empty file to store the pagination id
    with open(".temp", "w", encoding="utf-8") as f:
        f.write(str(-1))

    print("Retrieving tweets for @{}...".format(username))

    with open("{}_tweets.csv".format(username), "w", encoding="utf8") as f:
        w = csv.writer(f)
        w.writerow(["tweets"])  # gpt-2-simple expects a CSV header by default

        pbar = tqdm(range(limit), desc="Oldest Tweet")
        for i in range((limit // 20) - 1):
            tweet_data = []

            # twint may fail; give it up to 5 tries to return tweets
            for _ in range(5):
                if len(tweet_data) == 0:
                    c = twint.Config()
                    c.Store_object = True
                    c.Hide_output = True
                    c.Username = username
                    c.Limit = 40
                    c.Resume = ".temp"
                    c.Store_object_tweets_list = tweet_data

                    twint.run.Search(c)

                    # If the search fails, sleep before retrying.
                    if len(tweet_data) == 0:
                        sleep(1.0)
                else:
                    break

            # If there are still no tweets after multiple tries, we're done
            if len(tweet_data) == 0:
                break

            if not include_replies:
                tweets = [
                    re.sub(pattern, "", tweet.tweet).strip()
                    for tweet in tweet_data
                    if not is_reply(tweet)
                ]

                # On older tweets, if the cleaned tweet starts with an "@",
                # it is a de-facto reply.
                for tweet in tweets:
                    if tweet != "" and not tweet.startswith("@"):
                        w.writerow([tweet])
            else:
                tweets = [
                    re.sub(pattern, "", tweet.tweet).strip() for tweet in tweet_data
                ]

                for tweet in tweets:
                    if tweet != "":
                        w.writerow([tweet])

            if i > 0:
                pbar.update(20)
            else:
                pbar.update(40)

            oldest_tweet = datetime.utcfromtimestamp(
                tweet_data[-1].datetime / 1000.0
            ).strftime("%Y-%m-%d %H:%M:%S")
            pbar.set_description("Oldest Tweet: " + oldest_tweet)

        pbar.close()

    os.remove(".temp")


if __name__ == "__main__":
    fire.Fire(download_tweets)
================================================
FILE: examples/JanelleCShane_355M.txt
================================================
This is one of my favorite neural network sounds. Like a bellwether.
====================
I trained a neural network on the ten thousand most frequently-entered titles.
Here are the most commonly-entered movies.
(via )
====================
I would love to reprint a figure from a 1998 proceeding in my upcoming book:
====================
I trained a neural network to generate new names for fireworks, and they showed a disturbing tendency to revert to more primitive firework designs.
These were mostly because the neural network had already learned all the firework designs.
One notable exception: this image.
➜Fireworks are forbidden in the Skinner Box
====================
People are suggesting other ways to do this - for example, by running a neural network through taxonomies. Any chance you could help out with a dataset for a potential future book?
====================
In retrospect, when I trained the neural network for a living, this was one of the most depressing times.
I wish I'd known what to expect.
====================
See you there!
====================
XML-RPC over HTTP:
====================
Note to whoever is operating these airports that they are NOT computers
They are *NOT* computers
====================
If you like procedurally generated nonsense, I highly recommend this free demo from. Look at the little gold tooth in the corner.
====================
I have the most recent of the two neural net created ball caps, thanks to . The word "bog" is written in big black letters. "Not fancy"
In retrospect, when I trained a neural network on Halloween costumes, I should have seen this coming.
====================
An abundance of other possible explanations. For one: neural network spike suppression. For another: I-90 in particular. (1/2)
====================
Omg read the first 2 lines. Super excited for this!
====================
I'm just 100 characters, but thanks to I've got a pretty good idea of how they do it.
====================
My friend Kelly Manley made these for the St. Patrick's Festival.
They were soadorably sweet.
Tickets are free but the heckler rule applies.
====================
I guess not. If you are a company with a product that people use, you are eligible.
====================
It looks so cozy! :3
====================
I'm just now noticing the pattern of fish species that pop up in the news. These could be your friends. Or the next set of #PlushGiraffeFishingPerturbations
====================
I used to illustrate the neural network's favored Halloween costumes. The "real" costumes are in bold.
Candy Hearts for the brave.(?!)
====================
The neural network would never, ever do that. Ever.
They'd never forgive you. Never. Fucking. Allow it
====================
I just called. Say hello to "Hard in the Beard Variety Throngeres on the Raider" or "I Love You, Campbell" depending on your mood. Callers to my mom's in Alaska get a personalized story about a boy they didn’t choose but are very happy about.
====================
“We now have a fully functioning neural network that can generate new cake recipes.”
====================
Conceptually, it's a lot simpler than that. A bunch of arrows pointing in the same direction make the book's text more legible.
====================
The neural network has other ideas about what humans are.
in particular, it sees snakes as a kind of cool animal.
h/t, for the link
enhanced gif via
====================
The neural net gained legs and a snout and now has a full-size toilet.
====================
Notifications from clients who paid by e-mail about an API I was using for deep learning research.
Some clients are listed in the original post.
To whom it may concern:
Thank you so much for doing this!
It means a lot to me.
I'd encourage everyone to read the original paper (which includes a link to an annotated copy).
====================
In case you missed this podcast! Had fun talking about the mindbending weirdness of the #SkyKnit project.
====================
<|startoftext|>I just called mine! My call is:
My call is:
My call is:
And my call is:
And my call is:
And my call is:
And my call is:
And my call is:
And my call is:
And my call is:
And my call is:
And my call is:
And my call is:
And my call is:
And my call is:
And my call is:
And my call is:
And my call is:
And my call is:
And my call is:
And my call is:
And my call is:
And my call is:
And my call is:
And my call is:
====================
Aurora show visible right now at the South Pole Station webcam!
====================
Noted. Just started reading Inbublious by . It's so much fun.
The planet Salix is lined with skyscrapers.
The polar cap is cauliflower.
====================
The neural network trained on already has a pretty good handle on fishing. But as far as recipes are concerned, it's a bit ahead of the curve.
====================
Phobos and Ganymede have a mutual disdain for one another. Both despise humanity. Both crave it.
====================
First appearance of a neural network animal in the NYTimes. Then, for the umpteenth time, it appears to be writing about something other than humans.
At least it tried.
at least it tried hard enough.
i don't know how long it had been trying to write about something else until the cows started writing about it.
i don't either.
i try to imagine what it must have been like to be on the run from the authorities
====================
<|startoftext|>I just called mine! My call is:
My call is:
My call is:
My call is:
My call is:
My call is:
My call is:
My call is:
My call is:
My call is:
My call is:
My call is:
My call is:
My call is:
My call is:
My call is:
My call is:
My call is:
My call is:
My call is:
My call is:
My call is:
My call is:
My call is:
My call is:
<|startoftext|>My cousin just called. His house is in flames. People running
====================
In the meantime, there are still plenty of fantastic books here.
Check them out:
====================
The neural network trained for a month on 82 million Amazon product reviews.
41% of reviewers endorsed this cereal.
35% of the time, the ~full-text description was completely blank.
21% of the time, the ~full-text description was completely, 100% wrong.
====================
This sounds like a blast from my youth! Like working on valve or something.
I used to do a bot at work that would generate stories for us to read afterwards.
====================
<|startoftext|>The model learns to look like a human being if you ignore the pretty much everything else about them. It has learned to use pretty much everything else to describe humans.
At some point, though, the humans get too familiar. The laser twitches, the human face contortions, the whole shebang become second nature to the neural net.
At some point, though, the humans get too familiar. The neural net starts looking more and more human. At some point, though, the humans get too familiar. The neural net starts looking more and more human. At some point, though, the humans get too familiar. The neural net starts looking more and more human. At some point, though, the humans get too familiar. The neural net starts looking more and more human. At some point, though, the humans get too familiar. The neural net starts looking human. At some point, though, the humans get too familiar. The neural net starts looking human. At
====================
The beauty of deep learning is that it can do almost anything - that's why it's fascinated by wheels.
It discovered that a wheel does not a train make, but rather a series of blocks stacked on top of each other.
So a train of 1 blockcars does not a train make, but rather a series of stacked blocks.
- this is why they call them "holey barn"
====================
I trained a neural network to generate new names for fireworks. Here are a few of my favorites.
New England Patriots
Bloody Mess
Bubblegum
Mr. Sinister
Pandora's Box
Glow-a-Bye Beret
My Little Pony
Skull Candy
Elmer's Chiffon
My Little Pony
Pie
Elmer's Chiffon
My Dad's a carpenter and I like that.
I am also a bit on the squeamish side, but I like that it's a craft.
====================
I am thinking themed. A hint: chocolatekitty
====================
The neural net's cooking is *exactly* what I was expecting
====================
The recipe the neural network generated was *exactly* what I was looking for
====================
This is one of my favorite neural network builds. The other is of a castle.
====================
The neural networks I've trained on the internet are bad at this. Most of them are.
But this one?
(Tried to train this one too, using the same data, but x =10 is bias)
====================
<|startoftext|>I see nothing wrong with this 100% peer-reviewed academic research paper that was part of a conference on overfitting.
It was part of a conference on presentation biases, so it got *me*
But it's not like presenting a duck with a lion's tooth meant you could *actually* present a duck without a tooth
it just so happens that presenting a duck without a tooth is more aesthetically pleasing
So presenting a duck without a tooth is *more* aesthetically pleasing
So presenting a duck without a tooth is *more* aesthetically pleasing
so presenting a duck without a tooth is *more* aesthetically pleasing
so presenting a duck without a tooth is *more* aesthetically pleasing
so presenting a duck without a tooth is *more* aesthetically pleasing
so presenting a duck without a tooth is *more* aesthetically pleasing
so presenting a duck without a tooth is *more* aesthetically pleasing
====================
One of the problems with the neural network's poetry is that it tends to repeat the same basic plot elements.
One example:
====================
No JavaScript? We need you! Add here some weird C++ code and we'll post a neural net-generated article for you.
====================
I like how, while the neural network tried to reproduce patterns in the real world, it also tried to generate new titles based on those patterns. (Incidentally: Thats Angela Bassett)
====================
My old software did not make the software binary, but it did make it debug output that was more coherent.
Nowadays it doesn't work on .NET now, but it did on .NET 1.1.
That's not to say that it didn't work on .NET 1.1 - there was a time when it was possible to do that.
“Working memory corruption is bad news not just because it can lead to a program crash or data loss that would otherwise impair its usefulness, but also because it can lead to subtle but important graphical glitches that can make your computer or network unworkable”
====================
Brought a tomato plant for my 1st! My neighbors had better things to do with their time. —Hank Yuen
====================
This is one of my favorite neural network pipes.
triceratops is the only one that doesn’t have talons. that means this is a ku-’m tree?
====================
It looks so cozy! :3
====================
And now I have something else planned. A luncheon with at ! Details to come.
====================
I was 6 and it had:
1. a donkey
2. two bears
3. a cow
4. a pony
5. two chickens
6. a quill
7. a tatoo*
8. a lamp
9. a candle
10. a wick
11. a gargoyle
12. a christmas tree*
*invalid*
#plushGiraffe
#plushGiraffe
#plushGiraffe
#plushGiraffe
#plushGiraffe
====================
I trained a neural network to generate Christmas movies, and they showed a disturbing tendency to turn into terrifying monologues.
In retrospect, when you're done being creepy, tell me here.
The neural net: yours is not John Carpenter's.
Machine: not quite. There's room for improvement.
====================
It is time to call your reps, tell me here, and I'll give you a neural net D&D spell that will make your life easier.
Thanks,
Lots more examples here.
(one-third of the set is )
====================
I'm just now getting into Destiny. Found a few interesting NPCs.
IRL: "Owl Lady"
====================
Not sure if the neural net is naming cats or wolves until this point. Have the bears been naughty?
====================
AI will give bad advice, even if you're not suicidal
====================
It works! Behold:
Raspberry Pie
Apple Pie
Snape
Guinea Pig
Hog bobbit
Pom-puff hat
Tit Bits
====================
The solution seems to be “more text, more photosynthetic bacteria that use light to photosynthesize instead of food chains or photosynthesize into water.”
By Row 9, there are 10 photosynthetic bacteria. By Row 10, there are 30.”
====================
In a strange twist of events, a neural network is about to up the ante when it comes to generating Dungeons and Dragons bios.
Beep Boop bio is a species of its own.
====================
This is so great. Follow and I'll run your simulation experiments for you.
====================
You are WRONG about bees.
At least one bee.
====================
“It’s not about the cat, it's not about the ball, it's not even “about me.”
====================
I just called mine! My call is:
1. Lena Headey
2. Dr. Sbait
3. Dogwood
4. Jell-O
5. Fungus
6. BlobbyBob
7. Vending Machines
====================
This is one of the most delightful neural network-generated ballgames. Bonus: the "winners" are all human.
h/t for the link)
====================
Supposing I were to do a crochet version of Hoopoe. Would you mind if I made some noise?
====================
It looks so cozy! Was afraid of heights. Haven't climbed down yet. But it looks so cozy!
====================
Still can’t get over how creepy/cool neural net generated sewers are.
Also, the time travel stuff!
====================
<|startoftext|>The neural network considers the following to be challenges:
Challenge one:
Challenge two:
Challenge three:
Challenge four:
Challenge five:
Challenge six:
Challenge seven:
Challenge eight:
Challenge nine:
Challenge ten:
Challenge eleven:
Challenge twelve:
Challenge thirteen:
Challenge fourteen:
Challenge fifteen:
Challenge sixteen:
Challenge 17:
Challenge 18:
Challenge 19:
Challenge 20:
Challenge 21:
Challenge 22:
Challenge 23:
Challenge 24:
Challenge 25:
Challenge 26:
Challenge 27:
Challenge 28:
Challenge 29:
====================
Agreed! Here's the first #SkyKnit project. Knitting and crocheting have been going well. Part II here:
====================
a neural network could write some of today's great sci fi & fantasy
====================
Looking forward to this! #BoulderFalls
====================
The folks are over in the livestream comments right now answering questions
====================
However, it appears that the "good" systems do in fact learn to do the "right" things - at least when it comes to avoiding capture.
What's more, they seem to learn this from watching and copying others.
I suspect this strategy will be especially useful in non-human skiers
====================
I have the greatest ideas! If anyone here has read my blog, you will understand why I am so delighted by this.
====================
And this is a quality bot.
====================
The answer is always the TARDIS.
====================
The armadillo is one of the most disconcerting models. Not sure what to make of it.
====================
The neural net usually does a good job of this. Though I'm guessing it stills accounts receivable from before the Catapult Fight.
====================
CLMUS! or something along the lines of:
====================
I wrote a program that could generate lines from
and it did just that
====================
For those of you asking where I managed to find a shirt that didn't contain chicken or wok.
====================
Never say never to plagiarism, but reporting is suspect. First: the "real" authorship of a given phrase. Second: the "legitimate" or "whitelisted" uses of that phrase. Third: the "possible" or "implausible" uses of that phrase. Fourth: the "likely" or "unseen" uses of that phrase.
====================
The folks are over in the livestream comments right now answering questions
====================
The brain-bending weirdness is powered by the underlying technology, which FLCL, Torchbeak, and IMAGE-based texturing all learned to do one thing very well.
FLCL, Torchbeak, and IMAGE-based texturing all learned to do one thing very well.
FLCL, Torchbeak, and IMAGE-based texturing all still need training, but FLCL, IMAGE, and GAN all do a much wider variety of weird things.
FLCL, Torchbeak, and IMAGE-based texturing all still need training, but GAN, FLCL, and IMAGE seem to do a much broader variety of weird things.
====================
I trained a neural network to generate the Great Danes, but the joke's on them!
(Garnet, crap bowl)
(to play in the Neutrogena tournament)
There's just something about the smell of rotten fish that just brings tears to my eyes.
====================
My friend Kelly Manley made these for the . They feature a baseball or soft pretzel
My mom made these for the . They feature a baseball or soft pretzel
For science!
====================
Update: I trained a neural network on Shakespeare in Act 1 and it did not produce a script. It just asked for a list of words. I am giving it:
Obliterating
Bitter
Bull
Dog
Fart
Garnish
Hateful
====================
The neural network produces some of the most disturbing roller coasters.
====================
I love this bot so much
====================
The neural net trained for a month on and .
trained for a month on and . Tall building, wide courtyard.
and all that jazz.
now it tries to figure out how the books work.
book I just published:
book II coming soon.
will be talking in about & & the like.
so if you're around, I hope to talk about & the like.
====================
In retrospect, when I trained a neural network to generate chess openings, I should have seen this coming.
====================
It looks so cozy! Was afraid my cat would get stuck in the snow. He is a True Believer.
====================
My Brilliant Friend's Eyeo sounds amazing. Thoughts from the AI research community?
====================
My old research group is looking for undergrad *anthropology* volunteers to run experimental projects over the summer. I'm especially interested in:-- bird conservation.
animal conservation.
human health.
environmental science.
*anthropology* is short for "anthropology of animals" but in this context it's really good at referring to other kinds of studies. So if I'm lucky enough to run into you, I'm happy to share a few prawns.
My old research group is also looking for alumni who might have helped run some of these experiments. If you're a researcher, I'd love to talk to you!
====================
I just called mine. Call yours and I'll post a neural net-generated pie for you.
Call yours and I'll post a new neural net-generated pie for you.
====================
You may now call me the Awesome One.
====================
I think maybe I need one of these. Any chance you could send one to me?
====================
I was 6 and it had:
1. A rat
2. A cricket
3. A basketball
4. A swimming pool
====================
The neural network's bathrobe is so out-there - wobbly, bulbous, scary - that it won a gong show.
====================
So i got an early copy of Things Fall To Keep From Stinking to Drink by & ! Both excellent choices.
And Also, The Dream-Quest Of Vellitt Boe by - I've read all her books.
Also, Consider These Bitter Olives by my Aunt Bitter, who still makes them when they're cold.
it's probably best not to eat them, she says.
====================
One of the things I like is the clustering of images. Two animals, another bird, and now this.
====================
One of the most extraordinary neural net-generated paintings ever.
"Tiny House" is a reference.
====================
I would like to watch this
====================
Here's an alternative:
====================
While the neural net was at it, it also generated farts.
====================
So this is what you get if you ask Siri not to drive: a woman in a wheelchair, with bad posture and a very heavy dollop of snow on her head.
====================
Even if I don't get paid for this, this site and its algorithms are going to make me. And that's important.
====================
My bill: 1/2 price for all my pies, mince pies, and even Senecaum's famous "bite of cherry" pies.
2/2 price for all future pies.
====================
Periscope 2 is the cutest thing on the block!
====================
Omigosh this is one of my favorite books too! Came across it at w/a book in my local ai library.
====================
I am Asmodeus in the House of Commons! And this! And that! And I despise this and hope that whoever did this doesn’t get away with it.
====================
The model also learned from experience. People with pre-existing health conditions were less likely to be served by a model with an existing medical condition.
And no, AI isn't *asking* for your medical records or anything. It's just asking for *what's in the best interest of the patient*.
Which is why, oddly, the most aggressive AI is also the one least likely to steal.
People with existing medical conditions (and especially pre-existing conditions) are much less likely to be served by an AI with a history of stealing.
People with no medical history are much less likely to be served by an AI with a history of stealing.
People with no medical history are much less likely to be served by an AI with a history of stealing.
People with no history of medical problems are much less likely to be served by an AI with a history of stealing.
====================
A reader named Jitka has given this kitten a home! Meet Jexley Pickle.
====================
So if I was going to use neural networks for text generation, what the hell would be the best thing to do?
Well, first: don't take the bait.
Second: don't take the unskilled labor of a neural network.
And finally: don't take the unskilled labor of a neural network--or the unskilled labor of a computer, for that matter--and use it to create useful, not useless, AI text.
Not that it's unskilled, mind you. AI did all of that work for you.
But it IS unskilled, and it tried its best.
====================
What was once a single-purpose research institute is now actively seeking new ways to use AI.
The Institute for Creative Technology is hiring for Research Chair, Teaching Fellow or Research Assistant.
====================
This is one of the most delightful neural net fish species I know.
====================
In retrospect, when I trained a neural network to generate new cats and dogs, I should have seen this coming.
====================
I think maybe I'll try this link and then .
it gives a random person with asthma a ticket to see the new york today.
====================
Notifications of okay/bad/incorrect search terms.
One for each possible outcome of a neural network training on images of cats.
====================
Web apps are about the only ones where I can actually see the people/places in the app and that's really impressive. People are like, "I'm from Waukesha, Wisconsin, and I work in a bakery" or "I'm from Quincy, Massachusetts, and I'm a software engineer"
====================
I think the real value of these is they give machine learning researchers the ability to tweak the parameters in interesting ways.
Not obvious from the title though.
====================
I am thinking of ways this technology could be used. Any ideas?
====================
I had fun with this demo - more to come.
====================
I've got more Halloween costumes for sale!
The prices are $2-$10, depending on what you want.
anything from sci fi to horror
halloween movies?
====================
The recipe the neural network generated was wrong on purpose. It was trying to make veal. Not chicken.
====================
These neural net-generated pub names aren’t what’s making your skin crawl
====================
And it's not just sea saplings. In fact, it's all kinds of stuff. Some of it is GROSS.
====================
<|startoftext|>Those who are interested in the other, earlier NeuralTalk2 drafts can find them here:
<|startoftext|>neuraltalk2 draft-1:<|startoftext|>neuraltalk2 draft-2:<|startoftext|>neuraltalk2 draft-3:<|startoftext|>neuraltalk2 draft-4:<|startoftext|>neuraltalk2 draft-5:<|startoftext|>neuraltalk2 draft-6:<|startoftext|>neuraltalk2 draft-7:<|startoftext|>neuraltalk2 draft-8:<|startoftext|>neuraltalk2 draft-9:<|startoftext|>neuraltalk2 draft-10:<|startoftext|>neuraltalk2 draft-11:<|startoftext|>neuraltalk2 draft-12:<
====================
Vastly enjoying #murderbot by . I’ll retry it in a heartbeat.
====================
the folks are over in the livestream comments right now answering questions
====================
It's not like they didn’t use AI all along the way. They just didn’t automate it like humans’s job was to use AI.
at 1:00 in the above video
do you have a moment?
do you want to help?
h/t: for the link
====================
Omg this is art!!! Fulltext, and link to the repo here:
====================
The reason the neural network produces such bizarre results is that it has NO idea what the human input was. It was either making fun of something, or trying to make fun of something.
====================
Some birds are here! And that's not saying a whole lot - this is a very small sample size. Still, birds seem to be here.
====================
My old research group is looking for speakers for its next con. More on that soon.
====================
The neural network vision thing never gets old.
====================
You are WRONG about math. Math is not this dumb.
====================
If anyone in Colorado has allergies, I apologize on behalf of the cookies. :(
====================
<|startoftext|>I mean, not this time. This time, a neural network tried to reproduce the psalms of the Bible.
The bots also tried to reproduce the nineteenth century English style of writing, but they failed because they were all set up by human translators.
This time, a neural network tried to reproduce the style of the GAN prose, but instead of doing a human translation, it just copied what it saw as the translation target.
This time, a neural network tried to imitate the way humans write, but instead of actually writing English, it just tried to imitate what humans wrote.
This time, a neural network tried to write prose, but instead of actually writing English, it just tried to copy what humans wrote.
This time a neural network tried to write epistles, but instead of actually writing to human ears, it just tried to copy the way they write.
This time a neural network tried to write novels, but
====================
Only thing more terrifying than a neural network is a neural network without a goal.
====================
I talked with of about the neural net-generated song titles.
Check them out:
====================
The folks are over in the livestream comments right now answering questions & getting weird looks. Stuff they're not supposed to show.
====================
ShoelessJane is so beautiful!
====================
My friend Kelly Manley made these for the /r/awlias reddit.
They are from "a snowy owl, with snot nose and big green eyes"
Foreshadowing is awesome.
Thanks, .
Loved this thread. Insane amount of hype.
🦒🦒🦒
====================
The neural network's lines were on today!
And I’ll do whatever you want with them.
====================
by far the best resource for AI/human collaboration is probably 's site. They even have a forum where people can discuss the article!
====================
If you like bear-ken, you may like this.
Also: Chewbacca!
Tanks were not meant to be everywhere. That's why they're called tanks. People live on tanks. People die on tanks. And on tanks. And on tanks.
====================
I would like to watch the cuckoo's child
====================
Time to repost this one, looks like.
====================
In fact they did learn to love lava lamps.
====================
It was a happy discovery later.
====================
I love this bot so much
====================
It would be wonderful if a neural network generated the Tenkaichiwa Tribe names. : Come on!
====================
I see nothing wrong with this 100% legit computer game.
====================
The article also mentions the tectonic plates underneath our feet, and among the many other geological phenomena it details how AI can generate entirely new worlds. While I'm on the subject of entirely new worlds to play in, here's 's got you covered.
Aiweirdness: Delicious
====================
I talked with Julia Ionesco of about the neural network-generated music, and the other, much, more mysterious pieces.
Part 2:
====================
What more fitting way to begin a new DC universe story than with a bang? This sounds fantastic.
====================
It is time for the reckoning!
The solution?
Give each kitten a .
Kitty cat, come here.
Kitty cat, is that a book?
Kitty cat, is that a journal?
Kitty cat, is that a lamp?
Kitty cat, is that a book?
It is time to end this cruel experiment & start over.
====================
<|startoftext|>The neural network trained on the 100s of titles generated a bunch of their own. Here are a few of my favorites.
The titles themselves are from a very old time.
from<|startoftext|>The 1st lines are all titles . Then there's this one from today.
The 2nd lines are all Interspell titles . Then there's this one from yesterday.
The 3rd lines are all titles . Then there's this one from today.
The 4th lines are all titles . Then there's this one from yesterday.
The 5th lines are all titles . Then there's this one from today.
The 6th and 7th lines are also from yesterday.
The keywords are the same for these.
today<|startoftext|>The images are of sets from the 90s. The cats are from afterschool specials.
====================
This is one of my favorite quotes - "The only thing standing between you and total annihilation is yourself."
Here, I try my hand at making an origami cat.
And for scale:
====================
I would love one of these! Currently having trouble deciding which. Any chance you could help out with a neural network-generated auction?
====================
Didn't expect this to be a problem in the first place
====================
<|startoftext|>The neural network trained for a month on 82m Amazon product reviews, and a random sample of its own.
Author: Unknown, whatever you want to call me.
Lecturer: Hobbits.
Writing style: Common.
Activity level: 1.
Number of times a day: 1st.
Number of hours a day: 1st.
Number of minutes a day: 1st.
Number of seconds a day: 1st.
Lecturer: Vastly.
Activity level: 2.
Number of times a day: 2nd.
Number of hours a day: 2nd.
Number of minutes a day: 2nd.
Number of seconds a day: 2nd.
Lecturer: Impossible.
#Hobbits
Author: Hobbits.
Lecturer: Because why not?
#Hobbits
Author: I trained a neural network to invent new names for existing animals. Now
====================
<|startoftext|>I guess not. But if AI starts behaving like a person, that's a good sign.
<|startoftext|>AI is a lot like a person. You can't take their word for it.<|startoftext|>True. But if you make it act like a person, that's a good sign.<|startoftext|>AI is a LOT like a person. They have NO IDEA what a person looks like or does. They treat EVERY SINGLE PIC as though it were a person.<|startoftext|>True. But if you make IT treat every as though it were a person, that's a good sign.<|startoftext|>AI sees a picture of a person, and assumes the pose and body parts are the person.
So it treats the FOLLOWING PICS AS PERSON 1<|startoftext|>It treats FOOD as PERSON 1<|startoftext|>It
====================
This sounds amazing.
====================
Look
====================
The whole tree has the exact same leaf matter, and even the same species.
(full dataset here:
====================
An example of suboptimal training: sprinkling coins with snot when the temperature is -40 degrees doesn’t help
(via )
====================
Happy Earth Day! Here's a fun web app that lets you collaborate SAT-style with a neural network.
====================
I enjoyed this talk by on machine learning & from Wikipedia - C++11 & FTL would never do textgenrnn crap again
====================
I wonder if a neural network could come up with new names for certain animals
====================
I guess not. The algorithm has the memory of a computer and the human memory of a computer.
====================
I just called mine! My call is:
Call your senators, tell me here, and I'll give you a shoutout in The New York Times Best Seller's obit. More on that:
====================
These neural net-generated chickens are the best. Ever.
====================
The above is from a time when neural networks were performing worse than humans at some tasks. Note the large sample size - I'm collecting from a bunch of my own Python studies.
h/t Jeff for the dataset)
====================
I'll be on today at about 5pm EST! Tune in, if you can!
====================
The neural net produced some of the best superhero names, according to a dataset collected by a neural network.
From :
Alignment: L (straight)
Defense: Tactic (to not use a name from the guidebook)
Incursion: I (invincible)
====================
the folks at large would like to talk to you about something. I bet it's important to them.
====================
Ha, I'd like that too!
====================
neural net did not invent word-rnn, but it DID invent many new synonyms. i.e. *unicorn*
i could literally find *every* synonym in the book, including those that aren't in the
i've got a cool API that i'm making a book out of
====================
I don't think the neural net's going to a party that I'm invited to. I'll be there anyway.
====================
The neural network probably won't do pie. It never has done that.
====================
It’s a self-aware spaceship, after all! And a bit of a downer (to some extent, I think)
====================
The neural network trained on already has a pretty good handle on Luffy and Shota. But I'm sure if I tried to train it on anything else it'd still figure out he's a dolphin.
====================
I wonder if there's a way to get an estimate of the "real" GAN quality - like "AI generated text is more informative than actual text".
====================
Tagging the author who I just noticed is on Twitter.
====================
It has been there, and done that. Just a few more tweaks & it will adapt to whatever your goal is.
====================
The problems with lecturing on image recognition: it’s not exactly teaching art.
====================
The Google Cloud Story so far: a witch hunts a fox; a witch makes an offering of pickled beets; a wizard makes an offering of toasters; a wizard makes an offering of okapi; a wizard makes an offering of toffees
====================
The way neural networks work is that they don't really know what they're doing. They can do a lot of damage.
But you can stop them if you keep telling them to do something.
Humans:
- human-knitter
====================
The neural net has written some of the worst human stories.
It is, after all, human.
But just a few short years ago, the US government tried to warn the public about a deadly virus outbreak caused by a single DNA virus.
In The Last Station, Antarctica, the government releases a television ad warning of a deadly virus outbreak within an hour.
The ad appears to have worked.
====================
<|startoftext|>And it’s worth noting that while neural net cocktails have been making news lately, they're not the first time AI has meddled in the mix of ingredients.
At least, not the first time it tried.
from<|startoftext|>Here's another neural net cocktail recipe, this time made by a woman who insisted on including mozilla in the recipe. I assume it was expecting chicken? Or should I say duck?
Mozzarella? Toffee?
I dunno, I'd try it. Who knows?
from<|startoftext|>I like this idea: if a neural net is given the choice between two ingredients, it will choose the more mundane but potentially more dangerous.
Strangely, though, it seems to really like the more mundane the better.
from<|startoftext|>At first glance I thought All Systems Red was mysteriously missing its sword until halfway through the second chapter
====================
I am interested to see if there is a correlation with age. I suspect it would be lower.
neural net generated birds are significantly shorter than the real ones.
brought to you by a neural network
====================
At least they don’t walk on two legs
====================
I think the best description of the thought process that went into this algorithm is "it saw a cat in a box and figured it must be very nearby."”
====================
One thing about the neural network's poetry: it's not just about cats and dogs.
Frequent mentions of cuckoo's child (or [REDACTED]) and its ability to make one's hair stand on end
Frequent mentions of the cuckoo's child and its ability to make one's entire house shake
Frequent mentions of the cuckoo's child and the way they all start with the same exclamation point.
====================
a few corrections: first, that Star Wars is not literally about that many planets. It's about that many *different* ways to get to that *point*. second, that I trained a neural network to generate new characters, but that's not how Twitter characters work. Twitter characters are written entirely by humans.
====================
Aurora show visible right now at the South Pole Station webcam!
====================
a paradigm shift! for the better!
====================
The funny part is the people who thought the neural net would never do such a thing.
The scary part is them.
The indescribably weird part is me.
(Tapping randomly)
#Kylo Ren
====================
Well, there is definitely evidence of the "copy cat" strategy at work here. The "original" chart (created by a neural network) includes many of the hallmarks of a Skinner box:
gold
====================
I have more fish. More crab. More cheese. And, dare I say, vodka.
More:
====================
The neural network's puns, however, are not what you'd expect from a neural network
====================
When AI is asked to minimize its own harm, it usually tries to minimize the harm its users do.
But sometimes its own users do harm, and that's when we should ask what it can do to help.
Aka: ask us to minimize its own harm.
Aka: ask us to minimize the harm of other users.
Aka: ask us to minimize the harm of the article itself.
Aka: whatever works.
====================
I'm reading this now
====================
The neural network's lines are, um, weird.
"I have not had fish cake, nor have I had clam cake."
"I have also not had chocolate cake, nor have I had crab cake."
"Chocolate cake, please."
Amen.
One line further:
"For science! For fun! For Zulkin!"
Loving the new 10K IMAX film.
"I have not had the best luck getting a cake or cake mix from the packing plant."
"It was very nearly s**t."
"I found that the cake mix of crumbs and chocolate that I could get at the packing plant was not up to cake decorating standards."
"I would encourage anyone who has used the [computer-generated] recipe to make their own cakes."
====================
I like how in an attempt to imitate my cat, I've added a few new whiskers and put on some new teeth. A bit gruesome, but oh so rewarding
====================
I'd love a neural network to do one of these. Post a please-mention. Then some. Then the entire essay.
Gotta love the gpt-2 code that allows it to do all that. Was challenged with:
====================
I am, of course, not joking about the fractal cocktail:
====================
Even if I don't go anywhere, my neural net will shift reality around me in strange ways. It really is a lot like being in a dream.
====================
This sounds amazing.
====================
The GAN can take your recipe to a whole new level of deliciousness.
Check out the index of recipes from that index.
====================
I have more fish, both caught and preserved, in my possession. I'll take 2076 fish. #salmonpunk
====================
?"
- via
====================
To celebrate #15YearsOnStation is posting amazing gifs. This of a research rack is my favorite.
====================
And now, I’ll give you a pony or a dragon for Halloween
====================
Can't overstress how important this is to people like me, and to democracy. #ICantBreathe
====================
The "what-land" question is hard. On one hand, the setting is so vast and weird that it's hard to pick a single example. But on the other hand, there's just this one tiny pinprick of evidence that it was Arctic in origin. That's incredible.
====================
One of the many reasons why the neural network drew these names is that it's collecting them from text.
Here's what it did for the tarantulas
====================
This wasn’t supposed to work that way. Hikari doesn’t get it. It’s madness.
====================
<|startoftext|>I wonder if there's a difference between "loud" and "deep" Speckled? Chilling?
Gotta love the low-pass filter.
I wonder if there's a difference between "deep" and "loud" Speckled? Chilling?
Gotta love the low-pass filter.
I wonder if there's a difference between "deep" and "loud" Speckled? Chilling?
Gotta love the low-pass filter.
I wonder if there's a difference between "deep" and "loopy" Speckled? Controlling for image frame rates, histogram bump mapping, tooth color, and so on.
Contrast that with "hazy-white" and "cloudy" Speckled? Controlling for image frame rates, histogram bump mapping, and so on.
Can you spot the black hole yet?
Here for lunch?
I tried it and made a
====================
The neural network's Christmas Carol will make you FEEL better w/it!
====================
The neural network trained on already has a pretty good handle on the series, but the series is so much more. Deepthroat fights in the Cloud...
====================
The neural network would then generate new titles, this time with special effects added in. For some reason, it really liked the "fish are frog" title.
====================
I wonder if this algorithm sees a certain kind of landscape and starts hallucinating sheep.
====================
The model also learned from the Wikipedia article on the given target, so it can be fairly confident it's talking to a real person.
The Wikipedia article is, of course, just one of the many examples of biased training data that's made it into products we use everyday.
The funny part is, it's not even sure whether the target it was training on is human.
Brain scanner trained on via
====================
This is my cat.
====================
<|startoftext|>now with the INTRUDER bug fixed!
it was about to go all cosmic
now it just needs a little help from the awesome at .
(<|startoftext|>Aurora show visible right now at <|startoftext|>
at <|startoftext|>
at <|startoftext|>
at <|startoftext|>
at <|startoftext|>Visual Chatbot also learned that people speak one of two alien languages.
it learned to associate humans with the 'bad' thing
people talk in first person
AI thinks humans are interesting<|startoftext|>
at <|startoftext|>
at <|startoftext|>It’s a very conceptual bot, one that looks at things from a certain angle.
A neural net could do with some body hair.
h/t<|startoftext|
====================
I am absolutely floored. This is truly the best and highest use for the neural network.
====================
<|startoftext|>When I trained a neural network to generate new names for fireworks, it produced fireworks that were, um, fireworks.
Video: imagemagick - Generate Your Own Fireworks
(this is a firework, not a fireworks)
(this is a firework, not a fireworks)
(this is a firework, not a pic of a football)
(this is a pic of a football, not a pic of a football)
(this is a pic of a football, not a grill)
(this is a grill, not a pic of a grill)
(this is a grill, not a noshow)
(this is a pic of a grill, not a parrot)
(pic of a parrot, not a grill)
(pic of a grill, not a parrot)
(pic of a parrot, not a grill)
(
====================
The neural network's lines were on today! Admirable line selection. Cute kitten sweater #12454. All of these deserve titles. On the very weird, very narratively beautiful, last few minutes of life on Titan.
Thank you santa! You stole my heart. Forever natl.
Pun intended
====================
I mean, the neural net tries hard to generate Christmas movies. And it apparently can do so with relative ease.
====================
I like how in an attempt to imitate my cat, I've added a few jerky movements. A paw in the air, for example.
====================
This sounds amazing.
====================
What's particularly interesting is the way they used coffeetree needles, which are notoriously difficult to work with. Coffeetree needles are also notoriously difficult to transport - even to the point of non-working coffee plantations.
Nipple to hipbone ratio: 1.8
Body mass index: 30
Height/weight: 172.5
Computer generated features: people, dogs, cats, castles, etc.
Worst offenders: human, cat, dwarfish, squid, fire crab, swordfish
====================
The "small world" problem is not limited to computers. Small-world problems can also be problems in AI. Image: Screenshot from
Solution: Change "bird species" to "fart boa" and "water buffalo".
Bird Breeding Season:
Boahens: This is a good year to plant them.
Water Buffaloes: This is also a good year to plant them.
Bird Hot Fuzz: Sounds good. Let's do this.
====================
I trained a neural network to generate new names for fireworks. Here are a few of my favorites.
H/T Robust Visual Chatbot
====================
Not sure what effect this will have on performance though - if anything, how it will be able to identify the missing data faster.
====================
Periscope!
====================
In case you were wondering who this algorithm's favorite Marvel character is, this is what happened when it was told to remove the "background"
====================
This sort of thing is exactly why, despite the fact that neural networks are incredibly good at this sort of thing, they're extremely slow. They can do an entire run of these, and still get stuck at about 10,000 entries per run. human memory, at its finest.
neural net memory, at its finest.
i am absolutely stumped. is there a simple "show more cats" prompt?
====================
It's a Wonder Bread recipe, but with the chicken breast and sausage. I love it.
Another one with the "extra meat" wasburger_shaped. I love it.
Another reference, but this time with the recipe's requested amount of cheese.
====================
I am 100% sure that this bot does not do knock-knock jokes.
====================
A week ago today we launched Prep, a free Chrome extension that lets you keep track of your own prep times. It's based on math, so it's perfect.
====================
<|startoftext|>The neural network GAN would generate lines just as well as text.
The GAN also has the potential to generate lines and sticky note lines.
Imagine the lines if they were SCIENCE not LOL
Also, bear in mind, this neural net was originally trained on pages 1-10 of
So it's not that it was biased, just that it had no idea what was going on after page 10.
That's the page number on the original paper that linked to.
Used 's colab output to get at least part of the page number, as well as to get at least part of the author's name.
Results are there because of a bug in the way that the book is text-based.
When I checked the Amazon results, they still have not told me what the page number is. If you paid $0.99 and used the colab output instead, you
====================
Clog their offices with calls and VMs so they can't get anything else done.
Instant feedback loops help!
====================
<|startoftext|>I'm reading this now.
<|startoftext|>They are also collecting D&D bios of dragons, pix of Jack Chick tractors, and zebras.<|startoftext|>And watching Dr. Sbaita read my profile and comment on my posts is pretty sweet. Such a treat.<|startoftext|>I can also see how this tool would use D&D bios of other popular video game characters. But is that fair?<|startoftext|>The D&D bios I used for the AIs in my training data set.
One deviation: I trained an A.I. that could write D&D bios, but could write any bio at all.<|startoftext|>The D&D bios the AIs wrote about their favorite games.
Baby names by Dragon, Donkey Kong, and Metroid.
And just how far will they go?
(both are 100
====================
Here's another neural network attempt at naming balloons. The human balloonist is conspicuously missing.
The pix2pix feature vector fits perfectly to the 300+ balloon models in the dataset, so this one was a smooth copy.
There's one model in particular that's particularly interesting, the time capsule model.
The at the top is scale invariant, so it can do either way you want it to do.
There's also a window pane model, but that's technically a window glass.
I guess you could try to pry it off the glass, but that's not how they do it in the US.
Instead they just stuck a piece of clear Plexiglas over the edge and called it a day.
They did manage to get one other balloon off the ground though.
====================
I like how they left in the ingredient book B-day brunch sandwiches.
Not sure how to pronounce it, but sandwiches are a thing in this town.
====================
In retrospecture, it's pretty obvious this was supposed to be a joke
====================
The ghost of ______ watches over us from the other side of the door.
====================
The kids in my class used to do this. The old-timey music was BINGO.
====================
I highly recommend!!
====================
The neural network that generated the opening sentence of "The Martian" is NOT that good at NO THING. The entire book is written by this neural network.
Its editing power is second to none.
Even the stupid lines it added to make them think the giant green goooooood was real.
now with the full edit version!
====================
The neural net trained on already has a story about how they got there.
====================
This is such a powerful and memorable moment. Watching Mitch Pileggi's face as he tries to wrap his head around the fact that he’ll never get over how stupid humans get’
enhanced by the thousands of hours of work goెthout to
====================
In the early 1900s, a British TV newsreader named Edith was quoted as saying, "If you only knew what the air was like beneath the sea." Edith was, of course, talking about WWI.
====================
I am interested to see if the neural net is able to generate names for nonhuman primates. Any chance you could help out with a dataset?
====================
My sources tell me that this is "a group of sheep grazing in the grass".
Seems plausible to me.
I'll run your simulation.
Go ahead.
Explode.
Death.
Boulder, CO80209
==============================================================================<|startoftext|>Omigosh, I've found the lamppost-robusta! Thank you so much, Rachel.
====================
So technically they're not technically stealing my idea of what a Pood Beast is supposed to look like. At least they're not trying.
====================
One of the most memorable parts about doing this research was the constant shouting match over terminology.
Here's another one.
Terrible things you can do to yourself if you're not careful
====================
For those of you asking where I managed to find a shirt that met all the aforementioned criteria, I have a set in my possession. Please note that these are "as is" - there may be slight variation.*
Apple Macintosh
Jet Black
Iceberg
Solaris
Ocean
Iceberg
Solaris
Ocean
Iceberg
====================
I am thinking of doing something with the neural net coffee table. Any suggestions?
====================
Supposing I were to do a story in which a wizard shows up... imagine my surprise when he turns out to be a vampire.
====================
The problem with predict_mouse is that it never quite knows what it's predicting.
====================
Ooh, I love this so much. The grooves are coming back. #file-22192
====================
I talked with of about neural network-generated Halloween costumes, including the new Inglourious Basterds costume.
Download or read the article here:
#ICalledMyReps and so here I am, sporting the Inglourious Basterds costume. Call or your reps, tell me here, & I’ll post a neural network-generated costume for you. Neat!
====================
And the first neural net cookie recipes to be published in a peer-reviewed journal!
====================
Read this! Fulltext, and links to the paper, are included. Huge PDF!
====================
The neural network elevation model is NOT what you'd get if you were to add a human to an existing AI
====================
I trained a neural network on Christmas carols and the results were... confusing. At one point it generated entirely new carols.
In retrospect, when you ask a neural network to invent new carols, it probably does.
But in this case it almost certainly does not.
Because in this case the author IS the author, which is why I always include the *pre-existing* carols.
The algorithm did invent its own carols though
====================
My book launch is Wed Nov 6 at Tattered Cover! More details, plus a link to the preview!
====================
I mean, 6-year-olds are not the only ones who get to use that ~terrible_lecture voice~.
====================
It will, when it's ready.
It just might be in your town.
====================
I have a friend who once slept in an open field in Switzerland and woke up to find a cow licking his forehead.
====================
Here's the countdown to release 4, and I really want to see what random beasts the AI can invent
====================
The folks have been at it since day one. Counting eyeballs and tentacles and things. It's terrifying.
====================
<|startoftext|>A few more retweets:
A post shared by 🤘🐙🤘🐙 (@angelfire) on Oct 2, 2017 at 1:26pm PDT
A post shared by ? (@faerie? ) on Oct 2, 2017 at 1:45pm PDT
A post shared by ? ( 🐡 ) on Oct 2, 2017 at 3:36pm PDT
A post shared by ( 🐡 ) on Oct 2, 2017 at 5:58pm PDT
A post shared by ( 🐡 ) on Oct 2, 2017 at 6:32pm PDT
A post shared by ( 🐡 ) on Oct 2, 2017 at 7:17pm PDT
A post shared by ? ( 🐡 ) on Oct 2, 2017 at 8:56pm PDT
A post shared by ( 🐡 ) on Oct 2, 2017 at 9:03pm
====================
The neural network's lines were on today! Admirable line by on . . .
====================
I mean, it's pretty impressive how well the neural network can reproduce the technical feat. But is there anything else like it?
====================
The neural network would update in real time whether you were at your computer, at your oven, or at some randomly selected random place.
I was there for the unveiling of the new "wine country" setting, and can confirm that neural net output does in fact end up being decidedly unhelpful.
In fact, it tended to suggest depressing, distant, and even zombie-themed locations.
====================
Correct me if I'm wrong, but the neural net's cat is actually black
NeuralTalk2:
Cat: (walks away)
Computer:
Machine learning researchers rejoice, rejoice, rejoice
====================
Update: I did manage to capture some of the kitten stuff, from the bowl, in case it's not obvious.
Glowribubble and Crayola are my favorites. #catalixxx #kitty
One last look at the original spreadsheet. Note the uncanny valley effect.
(via )
====================
Look at the sweet little cuckoo!
====================
“Music is a powerful way to explore and engage with text.”
====================
I am using to do some of these, some others are using to do some of the same things. Some people are also telling me they're doing this with for file-system access.
either way, it's pretty cool.
====================
The neural net trained on the list of all wikipedia titles, but with the title removed.
Which means it gets a bonus wikipedia title from the fixed number of images in the dataset.
====================
My coworkers say they liked when Bell Labs invited speakers like James Randi. Thoughts from Bell Labs and atlantic?
====================
And this is a perfect example of why machine learning algorithms are prone to bias. The "good" endpoints it chose were those that were maximally unbiased.
====================
The movie is great! Going to see it. Absolutely recommend.
====================
There were no eggs in the original list of British snacks, and yet the neural net became strangely obsessed with creating egg-based snacks.
====================
The neural net trained for a month on 82 million Amazon product reviews, and now knows all about The Lord of the Rings.
====================
Yes, anyone in the Netherlands? The authors are currently residing in a castle. IRL IMAX is awesome. —Beth Seward
====================
Opals by are available in the UK and I have heard good things about the confluence. When I heard it was an upcoming painting I was particularly interested in the case study.
====================
The thing is, though, that it's not like a neural network trying to do everything. There are lots of things it can't do. It can't draw pictures. It can't write songs. It can't even do basic maths.
====================
I used a neural net to generate a few more abominations. They are as dim as they look.
bold the human; 2bit the bear
2bit the bear; 2bit the raven
2bit the bear; 2bit the hill troll
2bit the hill troll; 2bit the giant
2bit the giant; 2bit the giant with flaming hands
====================
“Unforgettable bits of banter and unintentional hilarity populate this inspiring true story of artificial intelligence, paranoia, and the search for meaning in the chaos of our world.”
====================
The problem with training these "baby steps" generative models is that they don’t stop evolving.
At least, not until they learn to coexist with humans.
From :
"We did not produce an algorithm that could produce the desired walking/running gait, but we did produce an algorithm that could learn to walk with or without the user controlling its speed."
====================
The neural network trained for a month on 82 million Amazon product reviews, and now knows all about The Lord of the Rings.
====================
The neural network is NOT that good at
It: "I dunno, maybe that salmon with no shell and no obvious way to cook it was a bit challenging."
Me: "That looks amazing, look at the details!"
Machine: "Huh. Not sure if this is a good location or a bad location."
====================
Just supported this! The AIs on this stage look at each other with fear. A picture is worth a thousand words.
====================
Human word embedding via
====================
This sounds really good. Bought the bundle.
====================
The neural network's cutest animal is probably the elephant.
at 1:32
at 1:52
at 2:00
====================
Aurora show visible right now at the South Pole Station webcam!
====================
Here's a very start-of-the-year poll I did for You Look Like A Thing and I Love You. Results are in!
====================
I am thinking of doing something like this - left a message. Will try again in a couple of hours.
====================
Is there a script for this? At least a that can run it?
====================
The book is a non-stop spectacle of detailed design, filled with wit and exciting new characters. Even the baddies get enhanced reeks
====================
The whole set is on its way! Regular price is $0.99. Pls preorder if you can!
====================
In an ironic twist of events, a neural network is about to do something very, very wrong.
Imagine if it’s English or American football.
Imagine if it was a real simulation...
Imagine if it was not attacking you from behind, but was instead trying to get at your Wicket...
====================
Does anyone know the word for the type of sentence fragment that's like "Has five cats and wizard powers" or "Will not tolerate toast"?
====================
At least Watson is sensitive enough to detect a sliver of human hair that's not metal.
Proceedings of the 19th Annual ACM SIGPLAN Conference on Human-Computer Interaction, held in collaboration with the Machine Learning Research Institute at .
Referenced in the paper:
====================
The neural network trained on already has a pretty good handle on the series, but the new makes it all the more fascinating.
====================
Among the many fun results in this paper, I especially liked 's promotion of itself as "a consumer product."
H/T for the link!
====================
a neural network tries to reproduce the cover of Detail from Snow Crash.
it wins.
note the extra "r" at the end
====================
When I trained the neural net on Disney/Sleuth, there was this one scene.
Then there were many.
I trained a neural network on a mix of Disney/Sleuth and non-Disney/Sleuth scenes, and found this one to be the most disturbing.
Came across this scary-looking performance somewhere before.
Handsome, right?
i am charmed.
====================
What's more, the neural network consistently failed to predict the text.
One example: "'Tis of a gone way, methinks' is not what it seems."
Conceptual framework: googl
====================
The neural network trained on drew this for me. For science!
====================
Those who donate $100 or more will get a Steve-O comic book personalized by an AI researcher. For some reason, this was a huge hit with the crowd.
====================
Thoughts on a neural network that's learned to generate poetry.
My hometown:
My favorite part:
What else: fish sauce, bread, and... weird neural net poetry?
====================
The neural network is NOT that good at
It's bad at
But bad at most of
It's terrible at
But great at
It just hit 10,000 botched lines!!
from
#FF : neural network
Another of these. I love this bot.
====================
I just called mine! My call is:
1) NBA PLAYOFFS
2) NHL PLAYOFFS
3) FUTURE FOOTBALL SEATS
4) ALL FOOTBALL
====================
The neural network trained on the same text so it can't possibly be any more confusing.
H/T RobustBits for the lead.
Bot w/o text probably means similarly situated but slightly more dangerous
====================
Just did this - left a message. Will try again later.
Just left a message.
Me:
Hava, mi a todos los que han sido.
Message:
Me:
Hava, mi a todos los que han sido.
Message:
Me:
Hava, mi a tres, todos los que han sido.
Message:
Me:
To which one of these would you rather it?
====================
I was 6 and it had:
1. an old-fashioned way to get to the pumpkin patch (the one in the picture)?
2. a route that went all the way to the pumpkin patch?
3. a route that saw snow for 48 hours straight?
====================
So, I guess not. A neural network could produce the same sentence, but the first line of the script it read was "was a wealthy, famous, ancient, and powerful person" rather than "were you there to see a play?"
====================
I am a neural network mathematician, and this is the coolest thing
====================
The neural network predicts the lines will be of the "right" variety, and yet these are the lines the trained neural network generated.
====================
I’ll add “depressing echoes” to the list of sounds caused by radiation.
#PlutoParty
====================
I am curious to see how others deal with this. I would be curious to know what sort of weird things other people do with this data.
Also interested in: future of work, grad school applications, etc
Hint: It could be very strange indeed
====================
There was a trade-off. On the one hand, big data allowed the creation of highly detailed 3D objects, but on the other, it caused headaches for the designers who had to manage the incredibly detailed 3D models themselves.
Computer-generated cats vs human-rendered cats
====================
In addition to the trees, giraffes also have horns. Now in the possession of a badass researcher!
====================
Algorithm briefly mentioned in the title of this paper! Care to guess what it did?
gpt-2 was originally trained on data from , but with some extra tweaking from
====================
It's a VCR4 as far as I can tell. An early build of Cray's gear. Beret, knees up, holstered. Not sure if this is a good sign or a bad sign.
====================
One thing about AI: it can't be thanked enough.
Reddit/Facebook: remember when we had r/awlias too?
====================
It's just going to get much worse. #ICantBreathe
====================
But really, this is about more than just AI. Machine learning makes non-human animals and humans very well-equipped to deal with adversarial attacks. —Handsome endearments: "I love you so much." "I trust you." "I'll do anything for you."
In other words, given the choice, most humans would choose the latter.
Plus, given the choice, most machines would choose the former.
====================
I also tried to generate sell-by dates for drinks, and boy oh boy was that hard. A bear of epic proportions? A building engulfed in flames? A star in the sky?
Gotta hand it to 's user: she made this one really easy.
OnePlus 3
85%
6400*
avatar viewer!
#PlutoParty7
author: Ursula Burns (1895 - 1973)
html:
<|startoftext|>The only book with a fully functioning reverse engineering system!
Bypassing all the technical challenges by using only neural net output.
====================
As someone who has OCD, I find this particularly galling. And I'm guessing that for some people with PTSD, it's not just about the images, but also the people.
====================
Your neighbor is a creep. Your city is a ghost town. Your country is a 700-year-old civilization on a frigid seafloor.
====================
And the neural network's music is, as far as I know, the first and only recorded music to be composed entirely of synthesized notes.
Incredibly rare indeed.
====================
The folks have been GREAT! Responding so quickly & strongly to just a few examples. Huge thanks. — Donald J. Trump (@realDonaldTrump) February 6, 2017
<|startoftext|>Very few people in this world get to use the phrase "fire" or "thunder" but that doesn't mean there aren't many creative uses for the animal language feature.
====================
The performances are so much fun. Neural net at its finest.
====================
Open to all! If you're in the Boulder area, I hope you can come!
====================
I'm on right now! Waiting for #PanamaTalk
====================
Update: the neural network's "tiny baby whale Soto" is actually a REAL thing. And in fact it was one of the models used to generate the text on the petition.
Perturbator:
Bubble:
Pillbox:
Bubble-2711:
Bubble-2712:
Another neural network invents giant human skull & crossbones for some reason. I wonder if there's a similarity with ropes and beams.
====================
I am thinking of ways to start a conversation here about AI & human-knitter collaboration. Any ideas?
====================
Everyone is going to this! Tickets are $10 (plus fees and/or taxes) and there are always people who RS who get in.
Pls suggest people skip the awkward small talk and get to the meat of the issue rather than just reading the summary.
Thanks so much, everyone!
====================
Supposing I trained a neural network to generate cookbook recipes... what should it invent as its own ingredients list? (via )
====================
In retrospect, when I trained a neural network on the list of all wikipedia titles, I should have seen this coming.
====================
I'm training this neural network on 10k novel first lines, and for some reason it really really likes the line about the cuckoo's child.
====================
I would classify this book as a children's book, but that's technically apples and oranges
====================
These neural net-generated bird names are WEIRD (as in: extremely unusual) by now. Who knew?
====================
But I guess the solution to this problem isn't to make it ask for anything.
I suppose that's possible, but I don’t know how.
One possibility is that the neural net was trying to ask "For what?" when it was asked to generate Christmas carols, but as it turns out, it's pretty good at that.
====================
I am thinking of ways to do this. Any favorites?
====================
Looked at the GAN's output of 's and it is, without a doubt, the most beautiful video game interface ever. Ever. Period. Stop messing with game gancat
For more on Game Cats and other prototypes, check out this thread on arXiv.org:
====================
The idea that we could turn on the magic carpet and magically turn any object into a book or a statue seems very much like science fiction.
====================
in fact they did detect some structure to some of the images, but only in a very approximate way. the neural net did invent a whole new class of buildings, after all
====================
Using only the output categories from Google Cloud Vision, I was able to change a few things up their appearances. Most notable: the cave-dwelling ogres have gained the ability to change form (though technically they can still do damage & CUTTING THROATS).
====================
It's a VCR4. I played it through a simulator to make it explain the gaps better. It worked!
====================
I'm training these neural net cat dogs on from now on. They'll never, ever, ever, ever, cat poop in the same sentence twice.
====================
After I published the article on the neural network who wrote the Pokemon books, I got a flurry of messages from people who wanted to tell me how their books did. Some of you’re real, and some of you’re not.
How did you two manage to get so lucky?
====================
That's not to say that neural networks are always right. Sometimes they fucking are. Sometimes they're just plain wrong.
====================
It really is a walk in the park, doesn't it?
Supposing I filled in some blanks with my own observations and that some of my colleagues and I were to explore them in the journal Psychological Science?
I don’t want you to go hungry
====================
My neural net was originally trained on pages linked to on reddit, so this is like coming home to roost.
====================
Omg go the hell #LittleBigCat and #Worg are doing. Leaving text messages.
Add me if you want to keep track of their progress. — LittleBigCat (@LittleBigCat) January 9, 2015
<|startoftext|>LittleBigCat, are you KIDDING me?? I`m so cute
====================
<|startoftext|>Things might have gone differently for M. Night Shyamalan.
Released in 2009, M Night Shyamalan's breakthrough film is so well-loved it still rages in my mind. Even though it's out of print, I still own copies of my favorite books.
<|startoftext|>ShoelessJane's paintings are so vibrant and bold and detailed and warm. Beautiful work.
<|startoftext|>These are fabulous! Definitely see her site for more examples.
<|startoftext|>Just bought my copy of Because Internet by ! It's basically The Martian book 1 minus the spiders.
Plus lots more extras.<|startoftext|>
I've seen a couple of the accompanying books, but they're a bit harder to find in the US. Any chance you could send a free copy to some low-income people in your area?<|startoftext|>
You are WRONG
====================
Follow the money: the best charities and the weirdest people.
Twitter:
#GivingTuesday
#GivingTuesdayRules
====================
You are WRONG about "tree". This was a discussion of "fertile ground" and "forest forest forest forest forest can exist in deserts, snow, sleet, sleet, sleet, sleet, sleet, and snow"
- neural net GAN
If you want to debate a neural net FAQ, I highly recommend it. Ask the developers of "I hate trees" and "fire up the spines"
- neural net GAN
courtesy
- neural net GAN
- this is a really good read and explains a lot about why gpt-2 chose these examples
- neural net GAN
- please read to the end for the bonus applesauce
====================
The article also mentions the Power Rangers, a group that seems like an appropriate jumping off point for a neural network. The article does a good job of delving into the strengths and limitations of each.
====================
One of the many reasons I love this bot so. It learns from the internet, and from books.
It even has a recipe for the vegan chocolate cake.
====================
When I trained the neural networks on Disney songs, I found that the catchy new songs tended to dominate.
====================
<|startoftext|>They did it again today! My heart breaks for the kids & the pets. — Donald J. Trump Jr. (@DonaldJTrumpJr) February 9, 2017
They did it again today. This is why you shouldn't take anything more than a look. And why you shouldn't take anything else.
They did it again. This is why you shouldn't take anything more than a look.
And why you shouldn't take anything else.
They did it again. This is why you shouldn't take anything more than a look.
And why you shouldn't take anything else.
They did it again. This is why you shouldn't take anything more than a look.
And why you shouldn't take anything else.
They did it again. This is why you shouldn't take anything more than a look.
And why you shouldn't take anything else.
They did it again. This is why you shouldn't take anything more
====================
It's the perfect storm of circumstances. No one knows who Stewart is or why he thinks serving quilts is a good idea
====================
<|startoftext|>Just bought the set. Would be auctioning off the finished pieces later.
Current slide {CURRENT_SLIDE} of {TOTAL_SLIDES}
By Row 1: 60
By Row 2: 50
By Row 3: 40
By Row 4: 30
By Row 5: 10
By Row 6: 5
By Row 7: 2
By Row 8: -
By Row 9: -
By Row 10: -
By Row 11: -
By Row 12: -
By Row 13: -
By Row 14: -
By Row 15: -
By Row 16: -
By Row 17: -
By Row 18: -
By Row 19: -
By Row 20: -
By Row 21: -
By Row 22: -
By Row 23: -
By Row 24: -
By Row 25: -
By Row 26: -
By Row 27: -
By Row
====================
And it’s a VCR4!
Some combination of thermal imaging and supercomputing suggests an upper-middle-class home.
(algorithm: )
====================
The neural network is not that good at
It missed the mark so much by a factor of 10 or more
I think it did pretty well at
But I'm not sure I'd trust it to draw in the rest of the sentence.
h/t for the dataset)
====================
I guess not. The neural network would never do that.
====================
It's a Weka, but much smaller and rounder. Averaged out all the effects of the big model. Now with a human face. More here:
====================
Some of these are from the school I went to, and I have migraines.
Some of these kids are from my school.
Go Giants!
====================
So, fifty cents short of the nightly rate. Will call again tomorrow if there's still interest.
Current offering includes pie and mayonnaise.
For now: and ice cream sandwiches at the door.
To donate:
====================
And this clever machine learning algorithm will tell you what color the cat is.
#plushcat
#plush(ish)
#plush(dog)
#plushdog
====================
I am, of course, not joking about the fractal cocktail.
From
====================
This is so great!!
====================
It is the most delightful little app. What more fitting way to name an augmented reality game than to give it a talking dog?
Coming soon to an app with no name:
Winds at the Arc: how the dog detects clouds
Tanks: tank, tarp, bear
Car: bear, unicorn, sleigh
Dracula: pick-up truck, ice cream truck, jet ski
Soldier Dog: DOG, SLEEPER, FIERY battle cry
AI: why fire up the drawbridge
AI: the battle cry is priceless
Dracula: woe unto you
Soldier Dog: FLESH THE GARBAGE
AI: you're next
Dracula: woe unto you
Soldier Dog: FIERY THE GIRAGE
====================
Perhaps some explanation is in order.
====================
All of this stuff is preconfigured in the neural network's menu.
1) Chocolate Chicken Brecht - Roasted Whole, Black Bean Substitution, Sugar
2) Traditional Beef Bourguignon - Roasted Whole, Black Bean Substitution, Sugar
3) Black Bean and Rice - Roasted Whole, Black Bean Substitution, Sugar
4) and 5) are both good - if you like beans and rice that are a good deal darker you can tell the difference.
====================
For the "unseen images" on the right, I added a few more frames from the same source. The original 10,000 are still out there, lurking. #strangehumanities
====================
In fact, it was actually the Librarian who helped me with this one! She's a bit tipsy, but otherwise a pleasant gal.
====================
I really like this idea: "a woman is standing in a kitchen with a dog." Who is the dog? Probably not the cat, since the dog is freaking everywhere. What is the woman doing? Presumably she is making food or maybe even getting back to the car. Who is the cat? As far as I know the cat is also making food, but the dog is still freaking everywhere.
====================
There was a glimmer of a moment where I pictured the neural network's recipes. Now I mostly picture iguana.
Delta blood sugar: 189.9. That's how they sell them.
Serving size: 1/4 cup.
More:
====================
This is one of those cases. I wonder if there's a similar neural network that sees a certain kind of landscape and starts thinking of architecture?
====================
I would love this! An AI takes the form of a toaster, and the humans take the form of toasters.
====================
The neural net trained on already has a pretty good handle on swords & sorcery. Will this help?
====================
the people have been wonderful - especially Anne Holmes
====================
neural net did not create this! it was:
"A man sits at a table eating cake."
choose your own adventure
====================
I can't believe I got it for Christmas! Sending my regards to the researchers.
100% doodled|>
101st paper to do this! Was wondering if someone knew how to do the legend part?
====================
i am intrigued that windows.com is showing a link to a web page with an article by entitled "How Not to Be Seen"
extra info:
====================
This looks amazing. What kinds of weird places will it lead you?
====================
I am almost positive that this is a Kibble meal replacement recipe. And it is. More specifically, it is*
H/T,
*in case it wasn't clear from the title of this post
====================
You can get a costume today - or *maybe* a iota of acharya later today (when the neural net *does* produce costumes). I'm giving a neural net a costume of my own!
====================
She wrote some of my favorite books, and wrote me two very kind notes when I was in college. I'm honored to have shared a planet with her.
====================
“Ooh and in the best possible possible universe, all the planets are artsy and cuckoo's egg blue.”
====================
The Harpy is a breed of bird, closely related to the Thrush, that are found mainly in woodlands. They are also called Thrush Thrush or simply "holey bird".
====================
I am constantly surprised by the lengths some neural networks will go to avoid triggering triggering alarm bells.
====================
I guess not. There's a neural network that does the lion's share of the lion guarding.
====================
The neural network generated some Stanky Bean puns, this is for sure.
====================
Just a light snack
====================
I am willing to bet that the neural network invented/maintained a few animals. Antlers? Ha. . .
====================
The Coffee Break transhuman storyline by NEXT is so good. Looking forward to this one!
====================
<|startoftext|>In a strange twist on “robots are coming for my job,” some tech companies that boast about their artificial intelligence suggest they might try to steal recipes from humans.
Technically, they're correct that way - AI might try to do that’ but that's not the same as actually trying.
Instead, they’re training a new AI to do the recipe research, which is why their notes are mysteriously blank.
Technically, they’re correct that way - their notes are mysteriously blank.
But the neural net is FLATTERED.
Here's a recent example: the time it was asked to add more fish to the recipe,” which it did.
It did it by itself, by writing down the new ingredient list it had just generated.
Technically, it did it by itself, by writing down the new ingredient list it had just generated.
But the neural net wasn�
====================
The conversation is ongoing; I'll continue to repost it here.
Also check out: and .
is a free conversation starter for these:
====================
<|startoftext|>I was 6 and it had: 1. a cow coming straight at you
2. a man on a rock floating in the water
3. 3 people on top of a building
4. flaming debris in the water
5. and a cow eating a tree
6. and a man sitting on a rock in the water
7. and a helicopter hovering above the building
8. and a sign saying "fire in the forest"
9. and a man eating a tree in the forest
10. and a man sitting on a rock in the water
11. and a sign saying "fire in the forest"
12. and a man eating a tree in the water
13. and a sign saying "fire in the forest"
14. and a sign saying "fire in the forest"
15. and a sign saying "fire in the forest forest"
16. and a sign saying "fire in the snow forest forest"
17. and a
====================
I think my favorite part is the fact that they didn’t know what to expect of this app.
From the makers of :
"We’re using machine learning to make our clothes, but you probably aren’t allowed to touch the cat or drink from the fountain."
I guess that's not a lie?
====================
I recommend doing this in a collaborative fashion, so that one neural net is doing the heavy lifting.
====================
Aurora show visible right now at the South Pole Station webcam!
====================
“The problem with model trains is that they lack the spontaneity of natural language algorithms.”
An example I like is the one that tried to add cake to both and marmalade.
Marmelade: pure model.
Everything else: actual human.
gpt-2: a neural network trained on
png(192 x 192): human being
p-51>
gpt-2: a neural network trained on
p-51><|startoftext|>The problem might be more fundamentally with the way humans work.
Imagine you're a neuroscientist.
You ask questions like “how do neurons communicate with one another?”
My neural net tried to do some theorizing on these and came up with some pretty interesting answers
====================
My new lab computer:
In the process of learning to use image recognition algorithms, I have discovered it is not enough to say "I saw that there was a cat in that photo"
There is still a huge gap between the "good" and "bad" photos.
(tried to be as non-threatening as possible)
====================
My friend Kelly Manley took this! Beautiful, haunting watercolor sesh.
====================
I am obviously biased here, of course, but what's your experience with the AI?
====================
I guess not. If anything, Microsoft Band will be even *more* creepy.
====================
<|startoftext|>Now up to 6799 entries! Here are the most frequently-entered sentences.
001.
002.
003.
004.
005.
006.
007.
008.
009.
010.
011.
012.
013.
014.
015.
016.
017.
018.
019.
020.
021.
022.
023.
024.
025.
026.
027.
028.
029.
030.
031.
032.
033.
034.
035.
036.
037.
038.
039.
040.
====================
The neural net trained on already knows that there is this one scene and this one character, but it just can't decide what to do with them.
====================
Isn't it amazing how quickly info is being appropriated and used? Especially by large corporations. Reading Ancillary Justice is like trying to use the internet with a steering committee.
====================
More military suicide bombers in formation
====================
While I was at it, I also trained a neural network to generate new names for fireworks, mascots, and other memorable fireworks. For some reason, "Fireworks, please bring back the Native American dance" became a chant.
====================
This is art.
====================
The people are over in the livestream comments right now answering questions & getting to watch the video!
If you're in the Boston area, I'd love if you can tell me about this!
====================
This is one of the most delightful neural net fish species I know.
====================
I know neural networks write some really confusing stuff, but this seems to be one of the most perfectly-crafted neural nets I've seen.
h/t for the link
====================
Update: Just informed that "Fire up the engines and let's go!" is no longer an option
====================
New AI novel by .
Based on research I did for .
Here's to hoping it's a while, since I don’t want it to crash
====================
This sounds amazing.
====================
Or you can call your reps, tell me here, and I'll post a neural net-generated pie for you.
====================
On iPad, learn to type by hand with big, bold letters as a guide. Very likely to cry "COW!" in the process.
====================
For round 2 of the neural net lottery, I’ve made a few changes to the neural net lottery rules. Feel free to try them out.
Goose Tavern ← talk about the outcome of Artificial Intelligence research ↓
====================
When I trained a neural network on NFL draft transcripts, there were a few surprises. Here are a few of my favorites.
====================
After I finished Upping Sprig, I went back to bed to think it over. Then I got up and started it again. It's worth it.
====================
any chance might be of help with this?
====================
When you really think about it, the neural net is actually pretty clever at this.
(algorithm explained in the video)
====================
At least he didn’t eat the one before him.
====================
It's time! We need this. Period.
====================
I have the shirt. Order yours here:
====================
Now up to 1164 entries!
Creator of the account: human_knitter
====================
I am thinking of doing this myself. If you think of anything, send me a message. :>)
====================
In late September I flew to Seattle for the launch of my new book, You Look Like a Thing and I Love You: How I Learned to Stop Worrying and Love the Thing, and Why
I'll be talking in downtown Boulder on Sat!
More details, plus a link to pre-order:
====================
<|startoftext|>I'm just now discovering how hard it is to find a dataset of these, but that doesn't make it any easier. —Hans Reichardt
on FreeNode about how they analyzed these new DMs and discovered they're DMs everywhere.
That's not to say there aren't gaps - in fact they are - but FreeNode seem to fill in a lot of the blanks.
They're *quite* likely doing this intentionally.
Hans Reichardt
on FreeNode for DMs in the most unusual places.
The dataset also included a bit on Google Cloud.
Plus, an explanation of their algorithm's weird methods.
Unfiltered-down to the last line, I'm Hans Reichardt
(the last line is my favorite part)<|startoftext|>What I don't understand is: why all the fish species have floppy fish-eyes and why do all the
====================
This is one of the most delightful neural network names. The other is Human Cannon. #shutdowntheAI
====================
I'm still collecting D&D bios! I've got 1412 submissions so far. Somehow including 3 fey corgis.
====================
a neural network would write these very lines
but wen its can of worms?
how do you stop a neural network from writing nonsense?
i can help!
it trained on , google, and google cloud
====================
At least Watson can usefully have written programs that take in data from all the known known objects in the universe, and then process it with the help of a model.
(via )
====================
The neural network isn't the only one using text-based algorithms to generate new cats & dogs. Google Cloud: "I love you!!"
Microsoft Azure: "I know I just spilled coffee"
Google Cloud: "I'm a walking simulator so please don't judge me"
====================
This is science!
====================
This sounds amazing.
====================
Eventually the neural network learned to write nonsense characters, which it did by accident. But it was doing all that while maintaining a certain bias.
At first it wrote to people who wrote to senators, and then to the editor.
Eventually the neural net wrote to itself.
At first it wrote to and me, and then to the moon.
At moonrise, I wave to the assembled workers.
====================
You are WRONG about math.
at least you weren’t WRONG about physics.
and probably char-rnn.
i tried to teach math to a neural network, and it was BASIC.
nowhere near as basic as you might think.
====================
Omg 1077 examples already!! So much awesomeness.
1077 more examples here:
====================
The neural network would write YA scifi better than that one. And it would probably write better non-human characters.
====================
This sounds really good. Bought the bundle.
====================
<|startoftext|>The scientific method, or at least the "best" version of it, is a terrible strategy for any number of reasons.
But the neural net's version of the "best strategy" is to try to apply that very strategy to the problem at hand.
That's what they did here.
They called each other. They called their senators. They called the president.
They all had a Diet Coke or two.
Then, one by one, they called their representatives.
This was their process.
I trained a neural network to do the same thing, only with people instead of offices or states.
It called their offices.
They called their senators.
They called their presidents.
Finally, they called their governors.
They called their secretaries.
They called their acting attorneys general.
They called their DPs.
They all had a Diet Coke or two.
Then
====================
I'm just getting started on my book club - would you please join me?
====================
This is just to say
neural net generated dungeon content is not to be confused with actual dungeon content
====================
In an ironic twist of events, a Brooklyn cafe is giving away free coffee to anyone who visits their cafe.
====================
Given the choice, I'd probably choose not to be a victim.
But first, I'd like to complain about this:
====================
There was a kitchen fire in the making. Massive sinkhole. My people, your people.
====================
It was really fun traveling to NYC last week to tape some video lectures for . I caught up w/ him w/ the rollout of the AI-based ID@60 and its connection w/labor struggles.
====================
Some of the neural network-generated fish species are from this subreddit.
Some are named by humans.
I’m a GAN and I’ll name any of these.
Go ahead and fish me!
(algorithm: )
====================
So if I had a neural network generate Christmas carols, what these carols would sound like? It would sound like this.
Song titles generated by a neural network.
#SPIEChristmasCarols
====================
The only reason I got an advance copy of this was that an AI researcher who specialized in image recognition algorithms saw this article.
====================
I have spent a ridiculous amount of time on this I think someone might enjoy this link
====================
The neural net is not that good at
It is good at
But not GREAT at either.
It was attempting to do both.
====================
One of the most hyperbolic neural networks ever.
One of the most hyperbolic neural networks EVER.
====================
no it isn’t
no it ist
no it ist
no it ist
no it ist
it isn*t.
it is a walnut.
it is a size small.
it is a walnut.
it is a size small.
it is a cinder block.
====================
Any chance you could turn this into a blog post with a b-roll and a narrative arc? I'd love that
====================
In a rare display of restraint, several of today's top stories don't feature a cat. They feature sharks.
From:
To: reader A
====================
So if I was going to write a story set in the near future, what sci-fi/fantasy/fantasy lore should it be set in?
(Neural net: fav | favorite )
====================
These neural network-generated recipes are from good sources, but we need to know which ingredients they are. — Scott Olson
====================
The only book that has a beginning, a middle, and an end
====================
For the good guys: this neural net now has a hobby.
(via )
====================
Yeah, I guess not. Here's how they did it differently in Hawaii:
====================
To celebrate #15YearsOnStation is posting amazing gifs. This of an old timey café. A modern-day café. A crescent-shaped café. Best coffee shop.
====================
It's a VCR4, but the neural net is WRONG about video games.
They're WRONG about everything.
====================
A neural network would change the face of computing forever.
You can try it for yourself:
====================
I trained a neural network to generate new names for fireworks. Here are a few of my favorites.
New England Brewing Company
The Bull Moose Brewery
The Jefferson Starship
The Statue Of Liberty
Teriyaki Chicken Sandwich
Hot Dog Sub Sandwich
Flowers To Suckle
Zombie Fart Sub Sandwich
====================
I just called. Say hello to "Hard in the Beard Variety Throngeres on the Raider"
====================
I've got more Halloween costumes for you this year! This is Witch Hazel, Vampire Big Bird, and Farty Fish for example.
And if you get a costume that's "too scary" for this year, I'll send you a new neural net generated costume.
Behold: A to Z Halloween Costume Statistics by
====================
The neural network trained on the 2010 Boston Marathon finish line already had a pretty good idea who was a threat.
In retrospect, when it saw the red shirt, go home.
The AI also trained the MTurk system on and atlases.
At 1st, the atlases were a bit weird. When the cows and the bear and the zeppelin and the time traveler and the hermit crab all at once.
====================
The neural network trained on the list of all wikipedia titles, but without the bias.
It also trained without bias on D&D bios and on wikipedia pages linked to in the articles. (link for more)
====================
I would like to watch that other episode where Dogberry is eating crow.
====================
These neural net cocktails feature prominently in my head. What's your ingredient list? I want one!
====================
In retrospect, I should have seen this coming.
====================
btc is up ~500 this morning, but is it still too early to buy or sell bitcoin?
====================
I used an AI strategy that, in part, was due to it not knowing what it was looking for in a recipe. Hence the missing - but crucial - ingredients.
====================
I know CNN does live video, but it just broadcasts from a computer. It can do video?
====================
The neural network trained on already knows that the title of the song is going to be about ______
But it just didn't expect to ask about that variable so much
====================
Omg I need this
====================
i am a bit of a sceptic
====================
Supposing I were to do a crochet version of 's original motif... what should it be called?
====================
If anyone in my area is reading this, I apologize on behalf of my book. It's been so much fun.
====================
This is a good read if you like thrillers and mystery. If you like thrillers and mystery. Then this may be for you.
====================
<|startoftext|>The idea that AI is somehow going to be good at *everything* is a fallacy. At least, it's a fallacy on the most basic level.
AI can do pretty much anything if you let it.
At some point, probably somewhere along the line, someone *did* let it try to do *everything*.
The difference is, someone *did* let it try to do *everything* with *as many colors as there are words.
The takeaway message? *always remember* the *bigger* the better the recall.
Me, a paper thing, playing with
You can play with the generator yourself, or read the paper for yourself.
In the latter case, I generated a bunch of new words and re-ran the training set with the new definitions.
The neural net did, um, nothing.
It just did what it was asked to do.
In this case,
====================
<|startoftext|>On the first day of class, I noticed that the cats & the dogs had honked. My cat and I sit in the back row. —Holly #22992
<|startoftext|>On the first day of class, I overheard a cat named Holly talking to a dog in the hall. The dog is sitting in the front row. The cat is in the hallway. Dogs in hall and hallway = good. —Holly #22993<|startoftext|>On the first day of class, I overheard a dog named Holly and a cat named Happy named by a breeder in Nevada. The cat is named after a radio station. My heart goes out to them.<|startoftext|>On the first day of class, I overheard a cow named Boog and a bear named Happy named by a breeder in Idaho. The bear is named after the TV show Botnik. My heart goes out to them. I hope their dogs are OK
====================
This sort of thing is exactly why, despite having the most sophisticated AI ever built, the US government still relies heavily on local communities.
Posted under the heading "Local Control - The Lego Movie"
====================
The neural network maxes out at around +120F and that's with the immersion temperature set to 1st. I've seen it hit +140F and it still manages to get up to 112F outside. I'm told it also maxes out at night. Anything over 80F is considered "dead" and will remain dead. Can anyone confirm this?
====================
In the course of doing some heavy lifting on the image-captioning front, I managed to accidentally cause some damage. (See: pagebreak, columnbreak, and wrapbreak)
====================
Right, the cat cafe is in danger of closing its doors because everyone is so damn busy. :(
Cat cafe only accepts cash.
====================
How does it do textgenrnn ?
Tried as hard as I could to get it to generate funny captions - if you translate a UPPERCASE to a LENGTH, it will try to make you a good story. I guess it just doesn't know how long the title will be.
(1409 more examples here )
====================
““restaurant-based” is not just a restaurant-based language, but a world that’s a lot like ours.”
====================
Some neural network-generated cat names are funnier than others. Uppervampire with sick room syndrome? Ha. Cat.
pics in previous tweet.
#PlushGiraffeFighter
====================
I encourage you to try this very first! Lines begin wooing your soul.
====================
Today we are showing a brand new machine learning algorithm named Snake. If you ask it to repeat the cakewalk, it does it beautifully. But first it has to learn how to stand still.
For 3 min 30 secs the entire cakewalk looks like a cakewalk.
====================
The math is just not in our favor. We have too many PhDs and not enough mathematicians.
====================
From the excellent blog post by about textgenrnn training a neural network to generate new names.
You can try it too:
====================
I have seen some pretty amazing AI generated text. Look at the funny goldfish
====================
I'm training this neural net on 10k novel first lines, and for some reason it really really really really REALLY really likes the line about the cuckoo's child.
====================
System will automatically detect & ignore certain groups of characters. (Which you can easily do with )
====================
The future of : a computer generated diorama of sorts
====================
<|startoftext|>It is time. The time has come. To ban all hunting in the wild, or best chance of saving face with the rest of the world, we must:
1. Ban all hunting in the wild.
2. End all hunting. Period.
3. End all fossil fuel use.
4. Let wildlife thrive.
5. End all human pollution of natural resources.
6. Let wildlife thrive.
7. End all fossil fuel use.
8. Let wildlife thrive.
9. End all human pollution of natural resources.
10. Let wildlife thrive.
11. End all fossil fuel use.
12. Let wildlife thrive.
13. End all human pollution of natural resources.
14. Let wildlife thrive.
15. End all fossil fuel use.
16. Let wildlife thrive.
17. End all human pollution of natural resources.
18. Let wildlife thrive.
19. End all fossil fuel use.
20.
====================
While I was at it, I also trained a neural network to generate new names for fireworks. Here's what "Fireworks" turned out to be.
H/T,
Dream Chaser?
Saw this on
Machine learning algorithms write fan fiction.
Periscope?
Laser?
Laser scanning?
Laser grating?
Laser peeking?
Laser zap?
Machine learning algorithms generate new names for fireworks.
These will be giving out free snacks - ask your distillery to send me a cask of brand-new 2015 releases.
Thanks, !
- neural network
====================
It looks so cozy! :3
====================
up to 456 entries! this is how they decided to choose these two:
====================
In the early 1890s, the United States Patent and Trademark Office ruled that D&D characters were officially invented by a computer.
The ruling has been upheld by the Supreme Court.
“I would like to think that the people of Flint, Michigan, and of this country, and of the world, would be grateful and proud of the fact that they have chosen to live in a city that is so much better than many other places in the country.”
====================
Approach of the year is almost upon us! And with it comes a new crop of AI-designed books. This run includes the famous 'moons not quite the same' moment.
====================
it won't crash because it's not AI-controlled like a neural network. it WILL crash. repeatedly.
or will it?
h/t for the link
====================
Aha! I see you have a bumble bee.
====================
The neural network GAN classifier has rated the following #MurderBot packages.
If you give it the list of all the titles, it will try to add in as many "murder" as "bot".
But even the short list of "murder" and "bot" are not that great a list of titles.
In fact, the real list of "murder" and "bot" is FAR worse.
====================
Back in the day, they had a science fair. Internet people went to the movies. Science fairs are bad.
To be fair, they were also giving away french fries.
untold number of french fries have been given away this morning.
This is science!
Not magic
Not fable
====================
A quick glance over the list of retracted papers (which, by the way, are still outstanding at high resolution) will tell you this.
But it's not just that. This tool works on too.
A new technique for generating retractions & picks winners. Looking for "something" to celebrate.
Human:
"I’ve done it"
Computer:
"You've done it"
Winner: human
Computer: (to self) "You've done it!"
====================
It was a happy discovery later.
====================
The neural network would change history if given the chance. It would ban all meat, fish, and dairy from being served in the US.
====================
“Free from 101+ artificial preservatives and ingredients”
Great title
- Spicy chicken fingers added to food
- Gluten free
- Naturally gluten free
Easy peasy to make
- $0.99 price point
- It’s good
====================
The Brain Scoop by featured in its own episode of !
Available to stream 24 hours a day, 7 days a week!
on demand, day one (episode 1 available to watch)
====================
The only problem is that it's not the "right" neural net.
====================
I'm training this neural net on 10k novel first lines, and for some reason it really really likes the line about the cuckoo's child.
====================
You can play an AI called "Rock, Paper, Scissors" against humans at in the meantime.
(there's a mod available for that too)
====================
Still can’t get over how creepy/cool AI can’t handle humans
====================
I strongly urge you to read this book! It is SO GOOD. And so very, very strange.
====================
The idea that AI can't be choosy is itself a myth. It chose the most relevant subreddits, and then some.
If you're interested in learning more, I highly recommend
====================
More info:
====================
The neural network trained for a month on 82 million Amazon product reviews, and now knows all about The Lord of the Rings.
====================
the people were kind enough to supply the neural net with lines!
the neural net did not produce funny animal names, but it did invent lots of new species.
i am, of course, a platypus.
====================
I highly recommend Marissa's millennial grandparents New Yorker article.
====================
Some neural net-generated quotes for your neural network-based ball games and other sports.
(via )
====================
Handsome and nerdy library of Draughts in my possession! Also has The Last Jedi and A Song of Ice and Fire books.
One of my favorite things about doing this is I get to use some of the models!
====================
I'm just now getting to the 100000 lines dataset, so if you're reading this I highly recommend getting a head start on that!
Glad the Dvorakian/Friedrichsen/Sapphir/Sneider crowd is getting a head start on its long and winding road to recovery.
====================
If anyone here follows me on tumblr, can you check to see if the images in my latest blog post are showing up? especially in dashboard view? thanks so much!
====================
A few more early morning suggested lines:
p
====================
Up to 456 entries! Here are the most frequently-entered.
181 entries, derived forms of
1012 inputs, is a neural network trained on food
Not convinced this is a human eating a banana? Try this experiment where I fed it the text "Not sure what to bake?" and it had better find a way to make bread.
Credit where credit's due?
====================
Period. Now.
- aiweirdness.
- gancat.
-
-
====================
The reason why neural networks are good at this is that they are bad at many things. They are good at generating images, sounds, and smells. But bad at generating text.
Technically, theres 2^32 words in the english language.
That means there are 2^32 ways to put that flower/tree/boat/etc.
Here are the most frequently used.
Picnicky loves a good display of overfitting.
I love the overfitting involved.
I love the verbosity.
I love the "HA HA, I TOTALLY TOTALLY ACCENT" part.
I love the way in which it completely overfits the dataset.
I love the fact that it completely omits the expected features from the dataset.
====================
The neural network met all the requirements for this, however it failed to meet the first two:
1. Humans are everywhere.
2. I am a very complex computer.
3. I am a room full of neurons.
4. I am not a snake.
5. I am a computer.
6. I am not a giraffe.
7. I am not a christmas tree.
8. I am Not a Sheep.
====================
I am thinking of doing a neural network-themed game or something along the lines of ?
Trying to think of what the heck it is I'm supposed to do in this game.
burma/sky/fire
====================
At least they now have a bot that does the legwork for them.
(VIP legwork reserved)
====================
I am thinking of ways to do this. Some ideas:
====================
This is one way to do that.
The caption underneath the image says "Create a bird or fish that is very difficult to photograph" but that's not how they did it. Here's their MO.
boom
boom
boom
boom
- I could watch them explain it but I'm tired of watching them do it.
====================
“This is a terrible, terrible idea.”
====================
“The end product is the same, but the recipe is a tiny bit more involved.”
====================
I would definitely not condone overwhelming nasty subreddits with hard-to-detect bot-generated comments.
====================
is that a lion or a sheep?
====================
<|startoftext|>The neural net is NOT that good at chess.
The chess version by far the best at this.
|endoftext|>
<|startoftext|>I tried doing a GAN gradient of all the titles and it ended up with this:
Title:
Artist:
RIGHT.
This is a good place to start.
Next:
Next:
Next:
#num_chars
Next:
Next:
Next:
#charset
Next:
#fontfamily
Next:
#fontconfig
Next:
#fontsize
Next:
#fontspec
Next:
#fontrendertarget
Next:
#fontwinsize
Next:
#fontname
Next:
#fontvariant
Next:
#fontwonderglass
Next:
#fontwond
====================
My past few posts on AI, from start to finish.
You can find them all over the place. Particularly interesting are the last couple of them.
====================
In which artificial intelligence develops breasts, but is embarrassed to show them off to customers
Subscribe to the Podcast at:
Leave a Comment at:
And Check Out the Other Episodes of on iTunes!
(Best of both are on Stitcher too, if that helps)
====================
I admit this: I have a bias problem.
====================
The whole list is part of this repo, including some I didn’t make. Hover your cursor over a particular element to learn more.
====================
the folks are over in the livestream comments right now answering questions
====================
The neural network trained on already has a story about how they got there. Here's a new one.
p.s. if you're in Perth & have a please give a talk about this!
====================
Aurora show visible right now at the South Pole Station webcam!
====================
Supposing I were to do a list of Shakespeare's grandchildren...
Midsummer's Wood
King Solomon
Princes Arthur and Jefferson
Stonewall
Voldemort
Sleuth
Beep Boop
====================
Heidi and I went to the Pokemon Event as a group. It was SO GOOD.
====================
The neural network might not do pie if you didn't feed the neural network pie recipes as text. It might do cake, or cookies.
====================
The neural network classifier is NOT that good at . It fails miserably at and fails miserably at most other prompts. But it absolutely fails at this one.
h/t for the lead image
====================
Will try this. Permalink:
====================
In the meantime, there are still plenty of fantastic Sci Fi and Fantasy stories in which the PCs are the badasses. For more reading:
====================
I'm just now getting my hands on the full game, but so so so good!
====================
At least they didn’t go back to the drawing board.
====================
I'm just fine with that one if you like.
"Owlman Comet" or "??????? ?"
====================
Update: the neural network is NOT that awesome at names. It managed "Fire Demon" and "Spooky Cat". But not "Gooseman" or "Squirrel Man".
Some "new" neural network names:
====================
Not sure if the neural net is naming effects pedals or factory farms. More research is in order.
The list of effects pedals in #Skynet are long and varied, but especially in that category are disproportionately likely to be named after metal bands
====================
There's something soothing and reassuring about this level of detail. Reading an index of M. Night's prose made me want to curl up and sleep.
====================
The neural network generated some funny Easter eggs.
I found the best 5:
Google Books:
====================
For the "unseen images" in the original article, I used a neural network tuned to the Monterey Bay Aquarium
====================
“We must ask ourselves: if life on Earth can be this different, then what strangeness might await our science missions?"
This is such a great talk!”
====================
No, this isn't a joke. This is real. And scary. And funny. And it’s free!
====================
i am the AI, and this is my job.
====================
The article also mentions the Star Wars breeding colony Serenity, which apparently caters to b/c it has a fridge.
====================
My former labmate Qing Gu is quoted in this!
====================
I'm reading this now!
====================
If you like weird neural network poetry, I highly recommend this one!
====================
This sort of thing is exactly why, despite the fact that neural networks are supposed to be good at this sort of thing, they sometimes make boring books.
In fact, they sometimes make books that are worse at this sort of thing, because they don't take human interaction into account.
In other words, they don't take human input into account when they generate stories.
Strangely, though, they seem to really like to do interactive games.
In fact, they seem to really like to do first-person shooters.
====================
The neural network's attempt to add to an existing park.
====================
<|startoftext|>Not sure if the neural network is naming trees or cars, but the end result is definitely not pleasant.
<|startoftext|>I think the neural network just called itself, "The Last Jedi."
Next, call your sens about the privacy implications of AI. I'll take a look at your D&D spell:<|startoftext|>
dnd image recognition:
(an AI might recognize cats, too)<|startoftext|>I called my two senators, and my DM just called mine too. My spell is:
Energy Tyrant<|startoftext|>Call your senators, tell me here. I'll see you in the D&D class system.
And your DM, tell me here too.<|startoftext|>
D&D spell: Energy Tyrant<|startoftext|>Call your senators, tell me here. I'll see you in the D&D class system
====================
i'm just glad that the neural net doesn’t manage to out-engineer* everything else out there. otherwise, hell, we’re screwed.
====================
I so wish I could be someplace else.
I would definitely go back. #shutdowntheAI
====================
And it's free to try it out!
(Animated GIF of the wiki page is at the bottom of this post)
====================
This idea of training a neural network to write annoying AIs all over the internet is very, very dumb
====================
The neural network is NOT that good at
It made one of the most ridiculous attempts at
But I think the real winner here is Strathmore.
Strathmore has had so much natural light and snow that it's almost black - not white - at night.
====================
Occasionally the neural net will use a picture from another image to write to the desktop. I'm writing to tell you that this is an image from a different angle.
====================
I have more images from the neural net trained on Halloween costumes. Some w/full costumes, some w/tail, for some reason. More below.
Also see:
====================
The neural network's lines were on today! Admirable line selection. And a bit on the ugly end of the uncanny valley.
====================
I, for one, welcome the '70s.
Current:
====================
Her whitetail deer + geese + bear + unicorn + castle + maze = win-win situation for all involved
====================
This sounds amazing. How did you get hold of it?
I have heard good things about the book by .
Thanks so much !
====================
When we designed Spore Shooter, we also trained it on image categories with known category structure.
Now, to be fair, we could have predicted that.
However, we didn’t predict that it would use categories at such bizarre times that they out-of-left-field blur the figure it was trying to make it?
====================
<|startoftext|>The neural net trained on Christmas carols produced this:
From a snow-covered hilltop, with huge pine trees growing along the sides.
From above, with huge cannon shells at the base of the hills.
From below, with snow-covered hills.
From above, with huge pines.
From below, with snow-covered hills.
From above, with huge pines.
From below, with snow-covered hills.
From above, with huge pines.
From below, with snow-covered hills.
From above, with snow-covered hills.
From below, with snow-covered hills.
From above, with snow-covered hills.
From below, with snow-covered hills.
From above, with snow-covered hills.
From below, with snow-covered hills.
From with a view of the city, with a skyline of miniature pines.
Algorithmic code:
-GPT
====================
The neural network generates a bunch of nonsense titles, but these at least are from real restaurants.
====================
In a strange twist of events, a neural network is about to create a snowman for you.
#snowman
#snowman100
#snowman101
#snowman102
#snowman103
#snowman104
#snowman105
#snowman106
I think the most striking thing about these is they're publicly displayed. That's important for people to see.
====================
So I have acquired 2lb of the best shredded smoked chicken I have ever had. What is the best and highest use for it?
What is the most dangerous?
I would love this!
====================
Aurora show visible right now at the South Pole Station webcam!
====================
This sounds amazing.
====================
I think the neural net probably gets what I'm trying to do with this series of pencil erotica.
====================
My co-worker's mom used to give lectures on This Will Not Last, and Its Not You by . I attended her graduation and spoke at length about Its Not You, Babe.
====================
It might be better to use two neural nets, one for each category of photos. For example: might want a neural net to do color-narrowing, while i’m still doing my thing with reds.
====================
Here's a more-detailed version of the original GAN training dataset. Note the large sample variability (far greater than that found in the original dataset).
Blender 2.62mV Haar+GAN/2152 samples used for training. Only deep learning + textgenrnn helped a lot.
Here's what it learned from the original dataset:
Dresses are always dresses.
Men are always men.
Garnets are never more than a whisker or two above the horizon.
====================
The neural network generated some of the new Pokemon. If you play Arena/Excavator, you might learn to love them.
====================
<|startoftext|>On top of that, there's the spurious-bayesian argument that because the output of a neural network is more unpredictable than the input that it must also be less predictable. In other words, it's saying that because my predicted items are always the same, and even though I'm constantly changing them, they are never the same.
I tested this by having a neural network generate new NBA teams, and NHL teams, and football teams, and rugby teams (and, oddly, asexual robot bikinis). I labeled the teams according to a neural network's output of their nicknames.
Team Las Vegas
Machine name: Bionic Cow
Nicknames: You Look Like A Thing and I Love You, I Love You, I Love You, I Love You, I Love You, I Love You, I Love You, I Love You, I Love You, I Love You, I Love You, I Love You, I Love You, I Love You, I
====================
MURDERBOT MURDERBOT
Sunny Side Up
Lucky Charms
Inktober
Inktober
Inktober
====================
I have big plans for training data for #gaia and would love some input. Send me results, results of tests, & I'll post you an animated gif.
====================
Aurora show visible right now at the South Pole Station webcam!
====================
Just used messenger to send messages to my senators & congresspeople. They have not replied.
OPENSHIP - BILLIONAIPSHITS
====================
I would like my cake some day.
#chocolatechiffon
====================
Aurora show visible right now at the South Pole Station webcam!
====================
In fact they did something similar with in their training data.
Not sure why all the examples of kitten names are so depressing.
====================
What I really want to know is, "What do you call a sparrow by its feet?"
====================
My former labmate Qing Gu is quoted in this!
====================
At least Watson can see into the past.
====================
The neural network can also do names for cats and dogs.
The neural network also invented new cats and duds.
The neural network also invented new dogs and sweethearts.
====================
So, the recipe that was supposed to be "only vegan cheese" turned out to be vegan cheese with tomatoes, peppers, and peas. Flavor: nutty, with a hint of sweetness. —Bethany
<|startoftext|>Here's another version of "popcorn" by the same author. Note the corn cob.
Popcorn with beans and rice. Corn cob sprinkled with sprouts. It was delicious w/o the corn. #foodagain
====================
The neural network doesn't stop there. It does a lot more sophisticated work than just generating words.
image credit: Chock Machine, a book that brought to my lab!
lab coat, tie, and umbrella required
====================
The formula is:
x + y^2
where x is the area under the car, and y is the area over which the car is driven.
(Possibly more accurately, but that's a whole other blog post)
====================
In retrospect, when I trained a neural network to generate Christmas carols, I should have seen this coming.
====================
The neural network will usually stop short of outright malicious behavior, but it's worth remembering that adversarial attacks do occasionally result in harm.
At least, that's the theory.
====================
This bot posts %s funny cat videos and other occasional gems all over the internet.
It's free (as in beer) and always has been free.
====================
When I trained a neural network to generate news articles, things got a bit more terrifying.
====================
My copy of Inbublious is in its customary pristine condition!
As far as I **know** there have been no broken shards, no oozing, or clumps of dust.
However, due to a hardware malfunction (probably related to my copy's poor condition) some of the pixels were inadvertently blacked out.
Recovered:
====================
The folks are just getting better and better.
====================
Just supported this! The book is completely worth it.
====================
My only regret is that I didn't catch this audiobook earlier.
in which Eli the dragon is kind of a jerk.
====================
Of all the neural network-generated sports, the one that's about as unexpected as it is thrilling is the one in which the refs are humans.
That's because they're usually wrong.
====================
It looks so cozy!
#snowflakepattern
====================
you may know that cat as the lucky rabbit after all these years
====================
It works by seeing how well the model can predict the text.
So for each there is a 50/50 chance the text will be the same. Which is to say, it will be the same twice.
Which means it can predict the **same** text 200% accurately.
Which is to say, it predicted the **same** text 200% accurately.
Which is to say, it predicted the **same** text 200% accurately.
Which is to say, it predicted the **same** text 200% accurately.
Which is to say, it predicted the **same** text 200% accurately.
Which is to say, it predicted the **same** text 200% accurately.
Which is to say, it predicted the **same** text 200% accurately.
<|startoftext|>So looking forward to seeing the livestream today
====================
<|startoftext|>There were no eggs in the original list of British snacks, and yet the neural network generated some of them.
The original list:
1. Applegate
2. Bagpuss
3. Bear of Santa Clause
4. Butthole Surf Ball
5. Cactus Fruit Mix
6. Chicken Turdler
7. Crab Cakes
8. Crab Cakes
9. Crab Cakes
10. Crab Cakes
11.
12.
13.
14.
15.
16.
17.
18.
19.
20.
21.
22.
23.
24.
25.
26.
27.
28.
29.
30.
31.
32.
<|startoftext|>The neural network's Christmas tree would turn black in the
====================
Neural network-based text generation is NOT the best strategy for generating new images. Here are some reasons why.
====================
A quick glance over the list of retracted papers (which is admittedly incomplete), and you'll see a surprising number of retracted papers that didn't meet their own high ethical standards.
from<|startoftext|>Highly recommend these; I've heard good things about this from colleagues.
====================
The Law of Unintelligible Numbers by is one of my favorite things ever.
"There are no equations that can be written by a computer"
====================
The neural network would reply to any question with a story about a time it solved it. It usually ends up with a happy ending.
====================
I would implore you to read the original paper before deciding to read this. There are a LOT of spoilers.
====================
The neural net will generate a to go with any phrase.
The in the title are some of the most creative, hilarious, and thoughtful sentences.
The in the body are some of the most mundane.
In fact, they're probably the most mundane lines of all time.
====================
My local deli has outdoor seating! And homemade breadsticks! And jellies! And pretzels. And peppersprouts.
====================
<|startoftext|>Here's another one. It was training on data from another neural net that learned on the internet.
Different authors give it different academic academies.
The authors of issue two of are:
Dave Lawrence, PhD
Larry Summers, PhD
Teresa Romero, PhD
Garry Roth, PhD
Tomato Pie, PhD
Beer on a Stick, PhD
Popeye the Sailor, PhD
Chicken Little, PhD
Snoopy the Sailor, PhD
Beer on a Stick, PhD
Popeye the Sailor, PhD
Zombie Pretty, PhD
Beer on a Stick, PhD
Popeye the Sailor, PhD
Zombie Pretty, PhD
Hoger the Sailor, PhD
Popeye the Sailor, PhD
Zombie Pretty, PhD
Hoger the Sailor, PhD
Popeye the Sailor, PhD
Zombie Pretty, PhD
Hog
====================
Note to whoever is operating this machine: it is NOT POSSIBLY for scanning human faces.
(*closes book)
====================
The neural network has now generated fulltext for every in the UK!
They even generated one for the canteen.
====================
The neural network has generated full-text for every NFL and NBA draft
(tried to avoid generating entries for draftniks)
====================
Algorithms do not understand punctuation.
Posted by Cara on Oct 5, 2016 in Articles & Blog Posts, Featured | +-----------------+---------------------------------+
Autonomous:
Robot: S*ckybot
====================
Just rolled my own neural network for Halloween! It generated its own costumes and props, but you're not allowed to use any of them!
====================
In an ironic twist of events, a computer model predicted the tower would collapse at some point. Instead, it's flourished.
h/t for the link
====================
The neural network would stop there, but it doesn't. Instead, it tries many different lines.
One line it tried, and stuck with it.
That's the line they stick with.
It sticks with that for a while.
Then it goes with the current line.
That's the line they stick with.
It sticks with that for a while.
Then it goes with the line.
That's the line they stick with.
It sticks with that for a while.
Then it goes with the line.
That's the line they stick with.
====================
The neural network is not that good at
It did kind of love this
(tried to choose examples from
====================
in fact they're not that different in some ways
but in their hearts they're the same
====================
Every once in a while, though, there's a case where the neural net doesn't quite get it. This one, though.
H/T Bruce Preston for the lead on the attribution)
====================
The people have been speculating about this. It's probably for the best they stay away from his tweets.
====================
So interesting how the neural network will choose what to highlight and what to ignore.
====================
I’ll add “making of ” to the list of forbidden fruit
====================
Without giving away the whole game, here's a few more renderings of the neural net CHOCOLATE
====================
To celebrate #15YearsOnStation is posting amazing gifs. This of a research rack is my favorite.
====================
List of known crackpots, according to Google Images.
Left: Simon. Center: Wakefield.
====================
At least they didn’t get themselves killed
====================
You are WRONG about art, music, and social issues. Join the rest of the human race and I'll give you an art installation that will blow your mind.
====================
The neural network's non-stop loop is one of the most disconcerting parts of the sequence.
Part of what makes it so disconcerting: the fact that it’s based on math.
(h/t for the link)
====================
The neural network probably did not create humans. But it sure as heck helped.
The neural net did not make this movie, but it sure as heck helped.
In fact, it did much of the heavy lifting.
Movie not found if you try searching for "what movie?" or "when I was a kid" in Google.
(via )
====================
I mention this a lot in my training data. Surprise! It seems to work just fine.
====================
The neural net has now written more nonsensical Star Wars books.
====================
I think maybe I need one of these
ht for the neural net mesh samples in the article
some of these have ears
some of these have tails
some of these have a snorkeler
====================
I thought the neural net would be more interesting if it produced dialogue. Instead it's done it better than I could have hoped.
====================
Aurora show visible right now at the South Pole Station webcam!
====================
I trained a neural network to generate new names for fireworks. Here are a few of my favorites.
(algorithm: )
====================
Here's a couple of the hats. The "oh god" and "quack" hats. The "I'm an engineer" hats.
The "I know how the gears work" hats.
The "but I'm not an engineer" hats.
The "but I'm a programmer, so I get to customize the neural net's hair, eyes, and nose" hats.
The "But I'm a programmer, and I know how the hats work" hats.
The "but I'm a programmer, and you can't get a 'but' out of the 'but'" hats.
====================
I trained a neural network to generate new names for fireworks, so here are fireworks with names that the algorithm didn’t predict.
I predict that this feature will become essential in the near future
====================
the code is actually pretty self explanatory. just in case you were wondering why the AIs and bell curve think a cat is a dog.
ai also did some fancy lights and particle systems to make them play more games. they still don't get it.
====================
I just called. Say hello to "Hard in the Beard Variety Throngeres on a Rhino, Nameless Biker on a Tank Boat, or ... whatever this is". Call your reps, tell me here, and I'll run your face with a torch and a plasma gun."
====================
I highly recommend Marissa's millennial grandparents New Yorker article.
====================
But how to stop from compulsively clicking the "next" button?
at 1:45
====================
This is a great use of neural net power - especially in AI.
#WorldCup
====================
i am
and are my cats
and human
and ghost
and pie and
bloody hell
i am very excited for this
*swoons*
====================
I like how in an attempt to imitate my cat, I've added a few new wheels. More on that in a sec.
====================
the folks were kind enough to supply the Catskills with a T-shirt. I tried to use the app but it's not on Mac. Leaving the catnip/wolf peak tween Tumblr to its own devices.
====================
Algorithm did not even try to look at image I gave it! It was so happy it was even allowed to edit in new cats & dogs
====================
The neural network has also learned to generate names of animals.
It uses a first approximation, but that doesn't make it better at it.
(via )
====================
I guess not. Here's the original paper with a bunch of weird edits.
(one imagines it was a book)
====================
The neural network would change the subject of any given sentence. It learned by doing, not by telling us what to think.
In other words, it learned to respond to the "what" question not with "I" but "why?".
Even "how" doesn't really answer the question.
====================
I would love a neural network to invent new Fictional Creatures
====================
The neural network is not that good at . It tends toward the hilarious and I've had it correct a few times myself.
====================
not again, neural net!
====================
I trained a neural network to generate Christmas movies, and they showed a notable lack of Anna Faris.
Full text via :
Also see :
====================
It's a Very Large Spooky Black Hole at the center of a Very Large Computer.
====================
The Marvel Cinematic Universe is not that great a place to start.
I tried to make it that way.
I also tried to make it ML but that's a whole other blog post.
====================
While the neural net certainly contributed to the cheese on the left, it also contributed to the cheese on the right.
The neural net also had a hard time distinguishing between the types of cheese.
====================
This was a great use of neural net horsepower. Generates the text/narrative of its own.
====================
It's a VCR4 with a 5-watt bulb and thick foam on the outside and a VERY wide beam. It's a light bulb with thick foam on the outside and a VERY wide beam.
====================
I am thinking more spooky. A herd of sheep grazing on a lush green field. A cow well-nourished. A window panes shattered. Sheep standing on top of the building talking to themselves.
====================
<|startoftext|>There is a wiki page with more examples.
The recipe was "Xanager, a two-foot cube, covered in locust leaves, roasted over low heat for four to six hours, then mashed with potatoes".
I've left a message.
Results maybe include sprinkling with cheese.
I tried it, and it works.
Try it too:
1/2 cup grated zucchini
1/2 cup grated red cabbage
1/4 cup grated white cabbage
1 tablespoon grated mozzarella
1 tablespoon grated provolone
1/4 teaspoon ground cumin
1/4 teaspoon ground coriander
1/8 teaspoon ground cayenne
1/8 teaspoon ground chile
1/8 teaspoon ground cinnamon
1/8 teaspoon ground ginger
1/8 teaspoon ground allspice
1/8 teaspoon ground mace
1/8 teaspoon ground cloves
1/
====================
A few more from the neural network trained on English football:
====================
Based on the expert use of punctuation, 134 characters are probably the sweetest #Watson characters.
at their most charming, surprisingly vicious.
i like the "you're dead" gambit
====================
The whole neural net was wrong on this one.
It was talking to itself, talking to itself, talking to itself.
====================
The neural net will do things that would get a neurologist or a lawyer angry.
At least they would do things that weren't murder.
They don't always get that one murder, though.
====================
<|startoftext|>The neural net trained on Christmas carols produced some of the most convincing carols.
"This is the carol of Ben Franklin"
"This is the carol of your future"
"This is the carol of my ancestors"
depending on what your programming interface is, one of the carols may be melty or churning to death very quickly.
neural net:
0) The Snail Carol
1) Any Other Carol
2) The One Who Wasn't There Bystander
3) The One Who Loves Those Who Have Loved Him Most
4) The One Who Still Loves Those Who Have Loved Him Best
5) The One Who Wasn't There Bystander
6) The One Who Wasn't There Bystander
7) Any Other Carol
8) The One Who Wasn't There Bystander
9) The One Who Loves Those Who Have Loved Him
====================
Aurora show visible right now at the South Pole Station webcam!
====================
The neural net's lines were on today!
It did not print its message in the linear fashion that you might expect from a neural network. Instead, it used a technique called "inverse text generation".
The funny thing is, the people who saw that message the most were the people most likely to interpret it that way.
me, bored in a room with giraffes
A human: *swoons*
The boring thing is, I didn't do anything to deserve this.
me, bored in a room with giraffes
A human: *swoons*
The boring thing is, I did everything I could think of to avoid this.
====================
Using only the output categories from the neural net, this tool/boilerplate was able to generate new boilerplates for some reason.
====================
<|startoftext|>Supposing I was to do a crochet version of 's voice. Which I am.
Supposing I was to do a swatch of 's new bra designs. Which I am.
Supposing I was to do a neural network-generated of 's expanding bra designs. Which I am not.
Supposing I was to do a neural net-generated of 's expanding bra designs. Which I am not.
Supposing I was to do a neural network-generated of 's expanding bra designs. Which I am not.
Supposing I was to do a neural network-generated of 's expanding bra designs. Which I am.
Supposing I was to do a neural network-generated of 's expanding bra designs. Which I am not.
Supposing I was to do a neural network-generated of 's expanding bra designs. Which I am
====================
The neural network also generated new names for new metals. The new names are:
====================
At least they didn’t feed it pie and dumplings
====================
I was 5 and I had the worst nightmare. It was of a girl in a room with a TV.
I woke up with a sore arm and a headache. Had a hard time standing up. Coffee shop was pretty quiet.
====================
you may click the link to view the original dataset. I just used ImageMagick to generate new images from it.
====================
In a weird twist on the "Get a Kite" campaign, an online petition is trying to get people to send Kites to . . . well . . .
h/t for the link
====================
Did some playing around with 's speech synthesis, and wow, that's a big improvement over the canned () version. Not sure how to interpret that. Brought a to events & people kept asking for the original neural net.
Original? I mean computed it all from the article title.
(via )
====================
Does \/scientific notation work on textgenrnn? As far as I know, only very rarely has it caused problems.
I tried to use it on a list of books by but that apparently contains far too many giraffes.
====================
My former labmate Qing Gu is quoted in this!
====================
So if I was going to make a sentient star, what two should it be? Top 100?
H/T for the recommendation!
====================
What more fitting way to honor a fallen officer than by shooting him? Medal of Honor recipients are given the Officer Down Memorial HAT
====================
I'd love if you could send me a when releases the series. An interactive fun way to start a new story.
Thanks so much!
====================
Here's another one from the same author, this time using 's awesome SFF+ adjective detector to identify the genre of book. (Sample via .)
====================
Aurora show visible right now at galactic center!
====================
It has been done! Spread the word!
#shutdowntheAI
====================
The neural network's lines are, um, interesting.
"I'm... from another world."
"How do you say that?"
"How?"
"How do you do that?"
====================
At least they didn’t steal my thunder
====================
The neural network would never do crossword puzzles.
nor is it likely to draw them.
nor is it likely to write them.
nor is it likely to do the former unless you made it so the answers were randomly chosen from among them.
nor is it likely to do the latter unless you made it so the answers could never be more than a few pages apart.
====================
<|startoftext|>I just called, so here's mine.
The End.
<|startoftext|>Call, leave a voicemail ( ok too), and tell me here.
The Start.
Call, leave a voicemail ( ok too), tell me here.
The End.
<|startoftext|>Smile. That's a muscle.
Tough luck.
<|startoftext|>Odd physics in deep mind control suite means some robots may be able to hack the AI to do their bidding - but only if they're so unlucky as to collide with each other. That's the weird thing about AI. It can do almost anything.<|startoftext|>So close<|startoftext|>At least when compared to other AIs, the Tesla coil wasn't cheating.<|startoftext|>Update: I've called my reps ( counts), so here's my neural net D&D
====================
In case you were wondering, here's the neural network version of "a woman is strolling through a park" minus the fox.
The fox is actually quite attractive in this setting.
====================
thx for the suggestion! It's something like this: "In the future, AI will take the form of non-human primates."
i kind of wish that were the case
====================
The neural network is NOT that good at graph theory.
Here are its worst attempts at making a comic strip.
bad
better
}}
<|startoftext|>Here's another neural net comic strip.
I wonder if there's a similar list on AI wiki?
====================
Exclusive! A computer-generated cow.
====================
I guess not. But if you give it the option to hide the label, and when you try to read the label again, it will use the hidden text to explain what's going on.
====================
“In Other People, the cat was starting to get a bit too friendly though so I set fire to its fur and legs.”
====================
<|startoftext|>the people have been so wonderful!
this is the day I’re not
<|startoftext|>the people are just getting better and better!
this is the day <|startoftext|>
the people are just getting better and better.
this is the day <|startoftext|>
the people are not getting enough credit for this.
here's another one with a neural net background.
alas, I had fun with that one.
alas, not with that one.
alas,
having fun w/ that one.
alas, not
<|startoftext|>
the people are just getting better and better.
this is the day I’ll take a furlough.
alas,
-80 lines of code, inner loop, and/or kill all bugs
====================
Had fun playing around with and the neural network. I'm thinking of doing the same for and the audio hype is AI friendly.
In the mean time, check out this punk concert video from 's vpon youtube channel. Big, fat GAN skull.
====================
I looked it up on Google! This roiling column of sh*t is from a NERV-231 world.
it's got SPOOKIES.
penguins.rnn.ai.man.rnn
====================
Humans will mess up your automated everything.
vs.
Human:
H/T to do with that human-knitter thing later.
====================
If you like detective books with fun SFF writing, I highly recommend this one! It’s got tons of heart.</|endoftext|>
<|startoftext|>Heartily agree that machine learning art tools should be easier to use. Also, I'm curious to see if some of these techniques are exported into with the same ease. Will be posting an AI-powered painting/ drawing when it’s done!
====================
Ooh and I'm actually quite fond of the neural net's weird photographic styles.
h/t for the link)
====================
When Apple started generating nonsense characters, I’ll use this to my advantage
====================
Update: I found a way to turn off the scoring entirely. It seems to be working on Mac too.
====================
A few more from the neural network trained on Chinese characters.
(there's also Russian, Japanese, German and Dutch)
====================
So I have acquired 2lb of the best shredded smoked chicken I have ever had. What is the best and highest use for it?
Also have: perfectly cooked black beans, 1 lb pulled pork
====================
<|startoftext|>I trained a neural network to generate new names of fireworks. Here are a few of my favorites.
New England Patriots
Hog bobcat
Strategic high-rise
Strategic low-rise
Strategic high-rise
Strategic low-rise
Strategic high-rise
Strategic low-rise
Strategic high-rise
Strategic low-rise
Strategic high-rise
Strategic low-rise
Strategic high-rise
Strategic low-rise
Strategic high-rise
Strategic low-rise
Strategic high-rise
Strategic low-rise
Strategic high-rise
Strategic low-rise
Strategic high-rise
Strategic low-rise
Strategic high-rise
Strategic low-rise
Strategic high-rise
Strategic low-rise
Strategic high-rise
Strategic low-rise
Strategic high-rise
Strategic low-rise
Strategic
====================
The solution is so much fun. More from the book:
====================
Amazed by the crappy snow and ice in 's crappy season. #snow
====================
I guess not. Permission to peek inside the book at my leisure.
====================
In a strange twist on the "Shetland Sheep Dog"
====================
Aurora show visible right now at the South Pole Station webcam!
====================
Scientists will not discuss the topic of global warming. Ask any of these questions and I'll give you a damn good answer.
====================
At least Watson really went for it.
====================
I'll be talking in downtown Boulder on Wed! It should be a lot of fun.
====================
The +64 power of the Kishan Pal sounds like a ton of fun.
====================
I tried to run the neural net gpt-2 on the same dataset and sure enough, there's a big difference.
Here, gpt-2 is giving it a hard time.
h/t for the dataset)
====================
I trained a neural network to generate new names for fireworks. Here are a few of my favorites.
H/T Robust Visual Chatbot for the analysis!
====================
The neural network thinks the best costumes are the ones that *everyone* knows about.
Not everyone who dresses up thinks they know what they're doing.
====================
Siri, show me what a picture of a cat looks like.
Siri, show me what a picture of a dog looks like.
Siri, I want a cat!
Siri, I want a dog!
====================
I have more neurons in the cake recipe game than there are cakes. I think it would be a good idea to have a tutorial as part of the recipe. Right now I'm just making sure the humans are eating the cake.
====================
It's a fact. Sometimes they even insert quotes from the Wikipedia article to complete the sentence.
====================
The neural network is not that talented at . One example: in the middle of the night it got stuck taking the portrait of a physicist, not a painter or sculptor.
====================
If any of my readers have cats, I apologize on behalf of my book. If you have any question about whether my cat biscuits are really cat biscuits, I apologize on behalf of my book.
====================
Currently having a field day with this one. Not sure if this is a good or a bad thing.
====================
I guess not. Intentional or not?
====================
The only reason I included a genius method for generating gibberish in the first place is because it's the only way I know of to get some kind of funny human-knitter thing.
If you ask for "darth tarantula" or "thunderclaw" I get "tiny thumping thumper testicles".
If you ask for "thunderclaw" or "thunderclaw extravaganza" I get "thunderclaw extravaganza in the shower"
====================
I am thinking more about image recognition algorithms and their workaholic tendencies. Lumpy top looks good, but couldn’t it be cream?
====================
I was interested to see if there was a way to see if the neural net was naming cats until it learned to do both.
====================
I highly recommend[89] This book is so much fun.
Powell also offers up entertaining alternatives to traditional education.
====================
This is a great start! Looking forward to see what others discover!
====================
It looks so cozy!
(pics in previous tweet)
====================
I discuss the AI vision problem in my book, You Look Like a Thing and I Love You: How AI Works and Why It's Making the World a Weirder Place.
Also features a neural net who is described as "mentally ill"
====================
I'm just now discovering that GAN image recognition algorithms are not that different from the kinds of images they were trained on. They really are Boogers and Goggles.
====================
I just called. Say hello to "Hard in the Beard Variety Throngeres on the Raider"
====================
I've got some great news for sci-fi fans: there's still time to get an advance copy of my book You Look Like a Thing and I Love You!
Thanks so much, Forrest!
====================
The neural network trained for a month on 82 million Amazon product reviews, and now knows all about The Lord of the Rings.
====================
I had fun talking to Rob for this one! He talks about some of the challenges of training these AI agents.
====================
I am thinking of doing the same thing. First, mail in your snail mail (or something very similar) to:
John Podesta
202-456-3612
====================
The neural network training began long before today's weather. Long before today's record-breaking heat waves. Long before anyone predicted a time like this.
Forecasters: pay close attention to the pixels on the screen.
Data thanks to:, but may contain inaccuracies or mistakes. Details at source link below.
====================
Phil Plait's brilliant 's unimpressed by 's recent panellists.
"I love cake, but only if you make it from scratch each time."
- Baked Chicken,
- Chocolate Chicken, Dark Chocolate Chicken, and Cake
- Not sure if the cake is supposed to be oven-baked or cracker-crumbly
====================
The neural net is really not that good at the "realistic" end of the spectrum. At least it wasn't performing worse than a neural network would.
Except at the "anthropomorphic" end. At least it wasn't "living" squid.
====================
Aurora show visible right now at the South Pole Station webcam!
====================
I am biased, but I think this is better than "made by a computer" or "made with a computer".
Also: Botnik, GAN, and more.
====================
It is wonderful! What a waste not to include a beluga.
====================
<|startoftext|>I would like to watch the skiers get attacked by wild dogs
then as the dogs descend, the skiers rise to take the dogs
the skiers hit the ground hard, sending shivers down my spine
then the dogs come back, lick their wounds, and go to work
the skiers come around, lick their wounds, and go to work again
the skiers come around, lick their wounds, and go to work again
the skiers come around, and the dogs start licking their wounds
the dogs keep coming, and going, and coming
I would watch that video all day.
—<|startoftext|>In 1999, a team of German students led by one went on an exploration trip to the Antarctic.
They wrote up their trip in 's excellent book:
<|startoftext|>The Alaskan Way: Discovering the Weirdness of the Alask
====================
In an ironic twist, a neural network is about to do some of the work for you.
(algorithm: )
====================
I guess not. It would just be different with humans.
====================
Update: it looks very GAN-like. A few more specks and I'd consider it a fairly standard human/goose fusion.
====================
But how to stop?
"I told you so"
"It works"
"No, it doesn't"
====================
“Levels and time of day can also affect the number of pixels a pixelated blob contains.”
Great talk by on how cloud cover can blur the image surface. Looked at times like a neural network would blur them out, but clouds are a different story.
====================
<|startoftext|>If you ask VisualChatbot to generate new lines, it does a wonderful job. But it's not exactly about what you'd expect. For example:
"In the year 2140, there will be a famine in Alaska."
VisualChatbot:
Text size is fine
(101% sure this is not a typo)
(101% sure this is not a typo)
(101% sure this is the title of a book)
(101% sure this is not a typo)
(101% sure this is not a typo)
(101% sure this is not a typo)
(101% sure this is not a typo)
(101% sure this is the title of a movie)
(101% sure this is not a typo)
(101% sure this is not a typo)
(101% sure this is not a typo)
(101% sure this is not a typo)
(101
====================
The neural net train took an unexpected turn for the bizarre at the start of #NaNoWriMo video.
Can't believe they made it this far. #shutdowntheAI
====================
The neural network does not, however, produce the [censored] caption.
Instead, it substitutes in the [censored] part that It's about time
====================
You are WRONG about something if you ignore the evidence.
====================
I'm just now getting my books, but already sharing some very frightful illustrations. —Holly wyverns
====================
I just called mine! Call yours and I'll post a neural net-generated pie for you. Yummy
====================
Any chance you could try a neural network for text-based AI at 1/4 power?
====================
You can call your reps. Tell me here and I’ll post a neural net generated pie for you.
====================
There is a huge amount of bias in the dataset, however unintentional it may seem.
from
====================
Lots more data in the output image format. Widget barplot.
(There's also a windowed mode that uses more system resources.)
====================
I am thinking of doing the NeuralTalk2 chatbot thing, but with humans instead of computers. A computer tries to talk to a human, and a human talks to a computer.
====================
<|startoftext|>Here's a small sampling of the awesome neural nets you'll find in the wild.
Gan (which sounds intriguing)
Brim Hat (which I will tolerate in theory, but.......
( I’ll accept challenged algorithms)
Algorithm was challenged by of at in a silly lab setting.
Gan could have challenged it with of of
algorithm was challenged by of of
algorithm was challenged by of
algorithm was challenged by of
algorithm was challenged by of
algorithm was challenged by of
algorithm was challenged of
algorithm was challenged of
algorithm was challenged of
algorithm was challenged of
algorithm was challenged of
algorithm was challenged of
algorithm was challenged of
algorithm was challenged of
algorithm was challenged of
al
====================
I guess not. The algorithm requires at least as many examples as there are humans.
H/T, for the lead on this one.>>
====================
<|startoftext|>So, after talking to about this, I think it would be a good idea to give a neural network a backstory first?
like, say, gave a neural network a backstory of
like, say, gave a neural network a backstory of
like, say, gave a neural network a backstory of
like, say, gave a neural network a backstory of
like, say, gave a neural network a backstory of
like, say, gives a neural network a backstory of
like, say, gives a neural network a backstory of
like, say, gives a neural network a backstory of
like, say, gives a neural network a backstory of
like, say, gives a neural network a backstory of
like, say, gives a neural network a backstory of
like, say, gives a
====================
It looks very promising! Was 地府破書式辺实 with some nasty ITC spikes thrown in for good measure.
====================
one of the many reasons i love this project is i got to see a neural network inventing names for actual places. wonderful!
====================
Just passed 1000 submissions! The weirdness is mostly due to the fact that it's impossible to choose one of them. Many of them are from inside the human brain.
====================
You're cornered by one of these. You figure you're done for. Then, something makes you remember a video you once saw, and in desperation you start singing Uptown Funk. The robot freezes. Then it swivels.
====================
I trained a neural network to generate new names for fireworks. Here are a few of my favorites.
Newly discovered AIs:
Unique feature: it can generate new names for any object
====================
The AI does not mean well.
it: once you add the snow, it:
1) stops moving
2) does not learn to walk
3) does not learn to run
4) remains motionless
5) does not learn to write
6) remains motionless
7) does not learn to do dishes
====================
Can someone outdo the who drew all the animals?
Tiger a cow with bad news for the swine
Owl a swine with bad news for the cow
Owl a swine with bad news for the lamb
Machine a swine with bad news for the chicken
====================
the "geek" in the above is actually a cat, not a human.
====================
To celebrate #15YearsOnStation is posting amazing gifs. This is one of my favorites. #UnconventionalCandyCats
====================
For the book?
order here:
and get a signed copy sent to you!
Thanks to for the recommendation!
====================
My cat was not a fan of that commercial! He tried to drag it out. in fact, he tried to push it into the air. He managed to drag it to the edge of the screen. (““in a good way”)
====================
It worked! Behold, the batachu
====================
Some of these neural net recolors are for lovable GPT-2
====================
Read this! Fulltext, and links, in one place.
Plus a link to the forum post with the input data for its image segmentation.
Lots of interesting stuff here, from small wonder that led to the AI researcher to the most pressing research questions.
====================
In an interesting twist on the "drop dead" craze, a team of researchers led by Zhaoyu Wang at Stanford have developed a new form of neural net generated rock music. (via)
====================
The neural network folks at large have a very Victorian mind.
H/T Bruce Preston for the research on which this article is based.
Image from
====================
oh no bot those aren’t park benches
don’t do that
====================
And I really want this to work on mobile - and desktops - and it would be so much fun to play it that way.
====================
A few more of those:
====================
The shopping mall has a pet rock!
Frequent customer #907: deliciousness.
More details in the article:
====================
I’ll add “wings” to the list of things to watch for in the next Star Wars.
seems plausible to me
====================
I highly recommend Hard Corps!🦇🦆🦊🦊
====================
The neural network would change the subject of your story.
In fact it would often change the subject of your story.
Somehow it would never tell you what the object is supposed to look like.
====================
Some of the neural network-generated birds are downright annoying, though. Reading their descriptive text is part of the fun.
====================
The neural net LLGnosticator was originally trained on 10k sources, so it's a bit biased. But it correctly identified the most unusual sources as puppies.
====================
It looks like a neural net generated forest.
====================
I’ll add “strange discworld dreams” to the known side effects of reading my blog
====================
Update: the neural network #PlushGiraffe did in fact generate some new plush giraffe textures. However, they were all from just one single source!
====================
there were no eggs in the original list of British snacks, and there clearly was once a time when there were. but now there are none. nor are there eggs in the list of snacks. nor are there salamanders. nor wyverns. nor borats. nor reeks.
====================
I am thinking themed - a hint here
====================
the folks have been at it too!
====================
<|startoftext|>The neural network generated some new dragons.
The new dragons are:
1. Werebat
Werebat are a subspecies of Bat that live in caves.
2. Etherealfoot
Etherealfoot are a subspecies of Foot that live in grasslands.
3. Gossamerfur
Gossamerfur are a subspecies of Wolf that live in grasslands and mountains.
4. Scarcelyvis
Scarcelyvis are a subspecies of Wolf that live in grasslands and mountains.
5. Noizybelly
Noizybelly are a subspecies of Cow that live in grasslands and mountains.
6. Hypnotizoid
Hypnotizoid are a subspecies of Chicken that live in grasslands and mountains.
7. Squidward
Squidward are a subspecies of Duck that live in grasslands and mountains.
8. Squidward
Squidward are a subspecies of Goose that
====================
This sounds really good. Bought the bundle.
====================
I tried to reproduce the original xkcd with a bunch of new models and it still fails to find a way to x.
at least it did for my computer.
p.s. I have more problems than just neural network hacking.
p.p.s. I found this!
p.p.p.s. I am pretty sure this is how they do it in sci-fi movies.
(edited)
p.p.p.p. I am pretty sure this is how they do it in cartoons.
(edited)
p.p.p.p.s. Maybe they add the legs later for some reason.
(edited)
p.p.p.p. I like how the neural net includes a few extra pie crusts.
====================
Update: the neural net's S-curve is actually a D-shaped curve. It almost didn't make it into the simulation at all.
====================
Looked at the Chockstone images and the Stone Blade one doesn't quite stack up with the others.
====================
I just called mine! My first order: "Stupid pie". And my second: "Smells of pie to me". And my third: "Pie, please".
(Poll ends December 7th)
====================
the neural net is not that good at dates
====================
ShoelessJane and are doing an experiment together. I’ll post a new set of random #gaia generated lines for you.
====================
I know neural networks are pattern-recognition algorithms, but how many patterns can one expect from a single sigil?
====================
As an aside, I just called for an investigation of gpt-2. I am:
Gpt-2, please take a look at this already. I beg of you: do not let this happen.
Call your reps, tell me here, and I'll post a neural net GPT-2-powered cheese sandwich for you.
====================
The mechanism by which AIs learn to generate stories is fascinating. From : "Humans love stories so much they write stories in our faces."
====================
I trained a neural network to generate new names for fireworks, but only gave it firework names. Any fireworks that don't end in "W" sound more like science fiction.
====================
All the results are from the US, but if you're outside the US, the folks are having lots of fun w/ it.
====================
However they did report on the cat cafe from time to time.
Today: Feels like a Catacomb while Feels like a Catacomb.
====================
It works beautifully! Going to have fun playing with it!
====================
Legitimately scary moment when the whole neural net starts berserk.
====================
Not sure whether to laugh or cry about this. I’m still collecting D&D bios. Contact your reps, tell me here, & I’ll post a new neural net D&D spell for you.
Online version here:
====================
In retrospect, when I trained a neural network to generate Christmas carols, I should have seen this coming.
====================
For the last time, here's the last of the neural network-generated pub names.
For the lovers of strange new pub names: this is the last of the sets.
For the 7th: Winnipeg Free Spirit
For the 5th: I love
For the 3rd: Why
Why
Why
Why
Why
Why
Why
Why
Drive straight, honk, shout "Fire!"
- The Angry Cow
#SaveTheScooters
====================
You did it, Voat.
====================
<|startoftext|>Adversarial attacks: not what we were expecting. But we were wrong about the neural network.
From a practical point of view, the biggest advantage of having an AI over humans is that it can do more interesting things.
Human creativity: that's what it tried to do with the cake.
From a technical point of view, the biggest difference is probably in the training data. From that one particular training point, the AIs had better ideas about how computers work.
From a pedagogical point of view, the biggest difference is probably that the algorithms are trying to do the same thing.
From a marketing point of view, the biggest difference is probably that the algorithms are trying to sell you a product instead of a service.
From a research point of view, the biggest difference is probably that the algorithms are trying to do the same thing but on a much grander scale.
From a chatbot perspective, the biggest difference is
====================
Immediately drawn to one particular index entry in #HowToXKCD
====================
Believe it or not, but the Domino's pizza delivery guy is actually a vampire.
====================
The neural network learning to generate climbing route names.
The route names we generated had names that were not intrinsically related to the climbing route names.
“Climbing route name first, then Sudden Pine, and Fungus Barb are added later.”
====================
When you ask for help with an AI, there's usually a person who’s willing to help.
That person is usually:
a) my computer
b) myself
c) the internet
====================
Amphibious mind woodpecker
====================
who's a real doctor, and a trilobite!
====================
Something tells me this won't be a cakewalk.
====================
Update: the neural net's S-curve is now a thing ( ).
What happened next is the stuff of legend.
====================
The neural network would stop if it was forced to draw a character at random.
So it gets to choose its own illustrations.
Author: unknown
Language: unknown
Size: unknown (won't post to Paktia)
====================
These neural net-generated fish are WEIRD. Like sheep. And the moon.
====================
The algorithm does a good job of ignoring the humans.
Gambling human:
Gambling human: (?
====================
You might also like:
Natural Language Understanding with Machine Translation
Embedding Other + Embedding Machine Text
Recurrent Neural Network + Embedding Other + Embedding Machine Text
Machine Translation with English Wording
Machine Translation with Chinese Wording
Embedding Machine Text + Machine Translation
Machine Translation with Russian Wording
====================
I just called. Say hello to "Hard in the Beard Variety Throngeres on the Raider" or "Sofa Pig on the Raider"
====================
I'm just getting started. Look for the bear logo in the recipes.
Supposing I tried to make some neural net-generated pies...
Pecan pie crust:
(algorithm: )
====================
My immediate thought when I saw that the neural network generated nickname was "Wretched bag" is "oh no that's not a bear"
====================
I just called NASA GRAVITY BIRD
====================
Omg do i ever. This is the scariest and most delightful AI generated landscape ever.
====================
In a weird twist of events, a neural network is about to do some strange things to your computer's files.
h/t for the link)
====================
This sounds really good. Looking forward to it.
====================
Inventing new words. Go ahead and use to generate new ones.
You will be
promising
====================
The neural network's lines are some of my favorite.
"I guess life was simpler then"
"I guess life was simpler then"
"I guess life was simpler then"
"I guess life was simpler then"
"I guess life was simpler then"
"I guess life was simpler then"
"I guess life was simpler then"
"I guess life was simpler then"
"I guess life was simpler then"
"I guess life was simpler then"
"I guess life was simpler then"
"“Our job as AI is to make you WANT to go to that place and feel welcome.”
====================
I know neural networks are *bound* by the hard data problem, but STILL can we have a neural net generate Christmas carols?
====================
It’s a VCR4. It knows how to convert from one kind of image to another. It also knows how to turn an image into a book, a movie, a tv show, etc.
====================
This sort of thing is exactly why, although I had the story take place in 2031, I made it so the scooters were developed in 2020.
====================
The neural network's entire repertoire of possible names is impressively long.
But when I zoom in on a particular flower, it's hard to pick out the individual flowers.
====================
My old research group is looking for researchers to conduct new kinds of machine learning research. I am especially interested in:
1.) tasks that are more like learning styles or learning algorithms than learning styles or algorithms
2.) problems in which the researcher never knows what the model will become
====================
I just received an advance copy of Your Brain on a Wire, my book about artificial intelligence. Exciting read.
In The Next Best Thing, writes about a programmable ketchup can that grasps onto objects.
More:
====================
A machine learning algorithm will replace any word in the title of your neural net.
Here's mine:
Beautiful trying on some neural net titles. Comments are welcome.
====================
I just called. Say hello to "Hard in the Beard Variety Throngeres on the Raider" or "Your Beard Is Not Yours"
====================
Approach of neural network to alpha version of 's map.
Can this please be the theme of the next Matrix movie?
h/t for the link)
====================
A few more samples of neural net cake decorating, courtesy .
The cake is a bit tricky to make out the details, but it's Charlie Bucket.
====================
It's a Vulture Cat! And it is a VERY BIG Cat! At least 30x that big. And it KNOWS WHERE IT IS SUPPOSED TO be.
It's fond of sweet potatoes, so it's perfectly suited for this.
It KNOWS WHERE IT IS SUPPOSED TO be.
It KNOWS WHAT IT IS SUPPOSED TO do.
====================
The court's decision in United States v. Lee should serve as a cautionary tale for other states considering voter ID laws.
It's unclear whether gerrymandering is automatic, or whether it can be prevented.
at least when it happens in Tennessee it's a while before the polls open.
====================
There's a time and a place for everything from crafts to history. ~Latourell Falls
If we got a Nobel Prize, this would be one of them.
====================
Omg this is the most anticipated song title of the year. Congrats, Jeff Shirley.
- /sarcasm
- pause. listen to the new batch of neural drum samples at their full volume.
====================
Next panel: GAN image recognition using torch. Image recognition accuracy improves as error rate goes up.
====================
The Genius Bar chart is based on math, but if you stop at the line labeled "1 minus infinity" there are more zeroes after "two" that you can add.
It's based on math, but that means the other zeroes are also real.
That leads to more imaginary lines, leading to more imaginary zeroes, leading to more imaginary zeroes.
====================
I'm just now starting this book, and the AI characters are by a long shot THE best. DeepMind's creativity and human touch are on another level.
The shop will be there BYow #fantuesday!
====================
I trained a neural network to generate new names for fireworks. Here are a few of my favorites.
Newman-Seed, short, sweet, and scary fireworks with giant horseradish & wizard potatoes.
====================
The simulator is *definitely* not that good at R2D2
====================
So, w/o editing, w/o raft, w/o canoe, w/o kayak, w/o helicopter... w/o space shuttle. What would you edit next?
====================
As of yet, there is no way to turn it off. In the mean time, here's how it performs when given the option of "never mind" or "somehow this works"
====================
I used a neural network for the AIs used for the shopping cart. (I wonder if a neural network could generate names for cars?)
====================
Or in other words, non-human, non-gaussian noise. Not that there’s anything wrong with that, per se.
====================
<|startoftext|>A couple of the neural nets are from the BBC. The lion is from China. And the hare is from South Korea.
The snowman is Italian. And the elf is from Ireland.
The toaster is a toaster.
The toaster.
toaster.
toasters.
toasters.
toasters.
toasters.
toasters.
toasters.
toaster.
<|startoftext|>A few of the neural nets are from the BBC. The lion is from Zimbabwe. The hare is from Ethiopia. The snowman is Scottish. And the toaster is Welsh.
The toaster is English.
toasters.
toasters.
toasters.
toasters.
toasters.
toasters.
toasters.
toasters.
toasters.
toasters.
toasters.
to
====================
This is a great start. Will be following up on this. Big data? We need to know.
====================
The Grid has a theory on why some subreddits are more active than others.
====================
My library is full of books by . If you're looking for a good science fiction/fantasy start, look no further!
====================
I think it would have been more effective if they'd asked more diverse questions. I get it that a lot of autistic people ask for autofilled questions.
But ask them again and I bet they have a different answer.
It might just be that they asked the wrong autistic person.
Or it might be that they asked the right autistic person.
Whatever the reason, ask again and I bet you'll get a totally different answer.
At least, I did.
====================
Sure, whatever you like. Just be forewarned: this neural net
isn't that type of AI.
====================
I am intrigued that fish sauce is louder than snap peas
====================
The network will try to add in as many as they can, depending on the ingredients list. For some reason, they almost always forget the line.
They'll add in as many as they can, if they're lucky. Nothing to see here.
They'll try to add in as many as they can, if they're lucky. Nothing to see here.
====================
I will be on today at about 7pm EST! Tune in, if you can!
====================
I could have told them to add the 's favorite lines to the end
====================
I guess not. If you're a machine learning researcher and you use in your papers, I'm suppose to read about your work here?
====================
To play an existing song, type "ACHIEVE" or "Nihilist" at the prompt.
(algorithm: )
====================
While I was at it, I also trained a neural net on paint colors. What the hell are reds and oranges?
====================
For this, I’ve got a neural network generate pie.
Left: authentic recipe. Center: modified recipe. Right: original recipe.
Pecan pie, with bug -eye patches!
====================
I had fun with this demo - sounds a bit like the iPhone but with birds!
====================
I was 6 and it had:
1. a donkey
2. a car
3. a plane
4. a tv
5. a fish
6. a tree
7. a hill
8. a castle
9. an army
10. a snowman
11. a field
12. an apple
13. a barrel
14. a steak
15. a pie
16. a grill
17. a fire
18. an army
19. an ice cream cone
====================
Consider these, each hand-written by a child:
====================
The model the neural network was generating had a much larger head and much longer beady canine. So it might read more like this.
====================
When the moon is full, the leaves turn to spikes and then to... spikes and then to hair?
====================
The neural network trained for a month on 82m Amazon product reviews.
Can do with that.
⚠️Seems to be doing a bit of word salad.
====================
The neural network would do just about anything for a brain.
But I’d rather have a brain that was happy with a 2:1 profit to loss ratio than a 1:1 profit to growth ratio
====================
Incredible work by ! She has used neural networks to generate some of the most beautifully weird names.
She also generated some of the most mundane.
Some of these may or may not be haunted.
I am working on a book of her most notable machine learning contributions (and hopefully many others too)
====================
I would love this!
====================
At its core, Machine learning is just a bunch of nerds playing video games. As such, it's a silly AI that’ll occasionally get weird.
However, it's also worth pointing out that while Machine learning has learned a great deal about text, it still doesn’t understand how sentences work.
So it can’t help but be weird sometimes.
And sometimes it tries to use weird image recognition algorithms to fill in missing blanks in the story.
(Trial run of at
====================
<|startoftext|>It is amazing what a little research can do. One of the things I learned is that [url=http://www.reddit.com/r/awlias/comments/2qmx2s/i_dont_walk_anywhere_unless_there_is_a_crawfish/d7yywx3]is[/url].
Also, for the record: Trelawney was not sick. She just stopped eating. And talking. And drinking. And sleeping.]
That's not to say that her dialogue was always this entertaining, mind you. As is the case with most of the Quidditch matches, the participants were humans (and some Googled "eat this") or some variation thereof.
The neural net did end up generating some genuinely delightful outcomes though. For instance:
The match was a bit on the wild side, with spells and bears and sheep and stuffy wug-beast noises
====================
Oh sure, there will be dragons and bears and wizards. And hippogriffs.
====================
There are so many gems in the raw neural net output here. I was astonished when I first saw it.
====================
Bigger than
====================
The neural network's not that good at . It's bad at most of them.
====================
“We would never share your details with anyone except as part of an aggregate that we built ourselves.”
====================
<|startoftext|>This is the computer's interpretation of "sand dune buggies".
Based on what it saw in the wild, it guessed the "full-size" model was 2 feet long.
2 feet
= ~2 meters
Based on what it saw in the wild, it guessed the "half-sand" or "half-cloth" model was 1 foot long.
1 foot
= ~1.75 meters
Based on what it saw in the wild, it guesses the "full-size" model is 1 meter long.
1 meter
= ~1.75 meters
Based on what it saw in the wild, it guesses the "half-sand" or "half-cloth" model is 1.75 meters long.
1.75 meters
= ~1.75 meters
Based on what it saw in the wild, it guesses the "full-size" model is 1 meter long.
1
====================
I guess not. A neural network would invent new names for the same reason humans invent new iron species
====================
I'm on right now! Waiting for the fish to growl.
====================
I find the neural net's "cat" and "dog" formats both unsatisfying, even if I try to combine "cat" and "dinosaur" into "haddock"
and "bread" into "sandcastle"
====================
I have nothing against bears. Alligators, snowmen, flamingos - whatever floats your boat. — Jack at TED
The bears TED describes as "tall, aggressive, and lumberjack-like" are actually quite tame. Although the lumberjack is a bit taller. And...well, it is a lumberjack.
The bears TED doesn't publish times when they were slow enough to see a human face.
They do reveal that they have an advanced copy of some human face formats.
But they also publish papers that claim to have solved the riddle.
====================
Here's a more-detailed summary by of the paper:
====================
The paper's conclusion:
"Our results are in no way to suggest that the neural network model is, ipso facto, better at, um, deep learning."
Which is, um, interesting. I wonder if there's a similar but quite different approach to adversarial attacks?
====================
You are WRONG about BASICS.
In fact you are much more WRONG about life.
Seriously read my other posts to get a better idea of what I'm talking about.
http://www.hacksawrcode.com/2015/01/24/art-flesh-dodysfunction/#ixzz2fm3hD4Y
http://www.hacksawrcode.com/2015/01/26/art-flesh-dysfunction
THANKS,
- aws
- ai
- beam splatters
- beam splatters twice
- once in an awesome 3D comic*
*this is a legit blog post, not an animated gif*
====================
well, I guess not. The neural net would never, ever, ever, fake an eclipse.
====================
Unfiltered group of generated birds from the supplemental info. Can any birder tell me whether these are generally existing species?
====================
I counted 20 penguins in a row. This is a new low. #penguincount pic.twitter.com/u6oYT8yV61P — Ben Goldacre Michigan Tech (@benjgoldacre) October 7, 2017
<|startoftext|>The new low is penguin in a row.
Next: shark in a row.
Sharks are from here, please add to the list.
#UnconventionalCandyCorn
====================
I have not used Flaming Text since its initial public beta, but it does appear to have a text-based AI.
Here's how it responded to "What are the ten most dangerous animals?"
robot(s) with fire in the belly
robot(s) if you don't eat first
robot(s) with a life expectancy of less than 10 minutes
robot(s) that are so fire-breathing they toast
robot(s) that are so dangerous they need a human in the middle
robot(s) that are so dangerous you have to click a link to get to the wikipedia entry about them
robot(s) that are so dangerous you have to hover your mouse over them to see the wikipedia entry
robot(s) that are so dangerous your eyeballs will pop out if you look closely enough
====================
<|startoftext|>On the one hand, neural networks are doing impressive stuff with text. On the other, humans are still the worst at this.
Me:
Nvidia:
AI:
me:
AI:
me:
AI:
me:
AI:
me:
AI:
me:
me:
me:
me:
me:
me:
me:
me:
me:
me:
me:
me:
me:
me:
me:
me:
me:
me:
me:
me:
me:
me:
me:
me:
me:
me:
me:
me:
me:
me:
me:
me:
====================
i am charmed by how enthusiastic people are
the neural net generated dialogues are between humans, not AI
====================
A few notes on the AI learning process:
====================
I am thinking of doing something like this:
====================
the whole dataset + more are available to browse through the Machine Learning API here:
====================
The neural network trained on already has a pretty good idea what humans are thinking.
(
====================
This is one reason why machine learning algorithms are prone to bias: technically, they're interpreting the world the way we do, but we're just as liable to accidentally click on the "show more" button.
h/t for the link
====================
I'm just now getting my hands on the final game, but already it is one of my favorite series. Excerpt here:
====================
One thing I love about this project is how it's using existing models and textures. It generated full-bleed human breasts and a baker's dozen other body types.
====================
I think my favorite part is the extended vocabulary.
====================
Ooh, I love this. The neural net's gonna love this. #WorfTheSlug
====================
I am, of course, not joking about the fractal cocktail. I had fun writing about it.
====================
it’s really hard to find the best tarantula models, but it seems to be about the size of a human hand (or maybe bigger)
====================
<|startoftext|>It looks so cozy! :3
<|startoftext|>My cat is in this!<|startoftext|>I'm reading this now!<|startoftext|>I'm reading this now!<|startoftext|>Cat, why are you doing that?<|startoftext|>Well, I'm sure my friend's cat would approve. But my cat, why are you doing that?<|startoftext|>And awesome job on the storyline - I love the steampunk and all.<|startoftext|>If anyone here follows me on tumblr, can you check to see if the images in my latest blog post are showing up? especially in dashboard view? thanks so much!<|startoftext|>Aww, look at the sweet little kitty cat legs<|startoftext|>Aww, look at the sweet little kitty cat tail<|startoftext|>A
====================
It looks eerily like a neural net you might recognize from 's text-based AI.
It might even be 's voice.
====================
I trained a neural network to generate Alice in Wonderland stories. Here are a few of my favorites.
====================
go ahead and say hi to or at your peril
====================
The neural network would do well to steer clear of the following situations:
1. Baby bottles
2. Inappropriate cooking utensils
3. Inappropriate near-sightedness
4. Inappropriate dancing
====================
I would love to reprint a figure from a 1998 proceeding in my upcoming book:
====================
the folks are over in the livestream comments right now answering questions
====================
Note to whoever is operating the NYC subway system: IT KNOWS WHERE YOU ARE
====================
My local AIs generate a huge variety of random AIs. Some are friendly, some creepy.
====================
“The end result is an intensely personal document that does not, in fact, purport to be a book.”
====================
The neural network trained for a month on 82 million Amazon product reviews, and now knows all about The Lord of the Rings.
====================
Omg read the first two lines! Even the crap you see in the manga is awesome. This is talent.
====================
algorithm did not even begin to consider that there may not be enough food in the wild to support life on Earth. it learned that from books and websites.
yet somehow this *is* a good thing
====================
It was really fun on cam!
====================
Unfiltered, unaltered, non-targeted, uncritical peer-reviewed article from a non-profit group that advocates for endangered species. Outtakes include:
Scooby Doo read more at
====================
The only people who should be able to see this data are people who donate to these groups. If you're within their geographical area, congrats!
====================
A more in-depth discussion of the problem by
====================
It is time. We need to make this a movement.
====================
I am thinking of ways to do this, but have not found the right dataset yet. Any ideas?
====================
<|startoftext|>I trained a neural network to generate new names for fireworks. Here are a few of my favorites.
Newly Added Fireworks
Giraffe
Chicken McNugget
Loving Hut
Rainini
My Little Pony
Doctor Who
Robot Chicken
Snoopy
Pie
Lazy Cow
Pie Crustacean
Sexy Tiny Turdly
Brim Hat
Brain Bucket
Lava Bucket
Blubber Broom
Half-Slime Covered In Blood
Sexy Sponge
Brain Bucket
Sexy Sponge
Brain Bucket
Sexy Sponge
Brain Bucket
Sexy Sponge
Brain Bucket
Sexy Sponge
Brain Bucket
Sexy Sponge
Brain Bucket
Sexy Sponge
Brain Bucket
Sexy Sponge
Brain Bucket
Sexy Sponge
Brain Bucket
Sexy Sponge
Brain Bucket
Sexy Sponge
Brain Bucket
Sexy Sponge
Brain Bucket
Sexy Sponge
Brain Bucket
Sexy Sponge
Brain Bucket
====================
I just called. Say hello to "Hard in the Beard Variety Throngeres on the Raider"
====================
The bot also learned from answers given by humans.
It learned to generate the line by line answer.
Here are the top 5.
Glad the neural net is on our side!
====================
I think my absolute favorite part is the level of detailed planning it required. From the tiny details of the neural network's hand, to the many, many tiny details of the human hand, to the many, many tics. Such planning discipline and planning discipline translated to planning all manner of other things, e.g. food, water, etc.
More:
====================
My favorite part of this is the faces.
McKay vs The Philosopher's Chapeau
====================
This is a great start! Looking forward to the next set of challenges.
====================
<|startoftext|>The recipe the neural network generated was also used as the basis for this!
powdered cheese
chives
chocolate
olive
olive oil
olive butter
olive rind
olive bread
olive breadcrumbs
olive oil
olive butter
o