How to begin a song

lewisandquark:

So a while ago I made my first Twitter/Mastodon bot, a very simple little bot called TheSingalongBot. It came out of a chance conversation at the XOXO conference with Kate Compton and Gretchen McCulloch, and all it does is toot/tweet/twoot the first line of a song. Humans can sing along if they want. (To date, SingalongBot followers have finished the ABC song, A British Tar is a Soaring Soul, When Will My Life Begin, and, heroically, The Saga Begins. They are also about 80% through 99 Bottles of Beer on the Wall.)

To build up a suitable list of first lines for the bot, I started with a list of Billboard top hits from the last 50 years. And then, because I was worried that the top hits might exclude some of the best songs, I asked people to submit the catchiest songs they knew. I got over 1,500 suggestions. Once I removed duplicates (“I threw a wish in the well” was submitted 14 times, followed closely by “Is this the real life?”) and combined them with the Billboard hits, I had 3,847 songs. The bot is going strong, and meanwhile I have this dataset.

Yes, of course I trained a neural net on it.
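For intuition about why the earliest epochs produce letter-salad: a character-level model this early has mostly absorbed which letters tend to follow which. Here's a toy, stdlib-only sketch of that idea (a bigram character model, nothing like the actual network, trained on a few hypothetical first lines):

```python
import random
from collections import Counter, defaultdict

# Hypothetical stand-ins for the real dataset of 3,847 first lines.
lines = [
    "i threw a wish in the well",
    "is this the real life",
    "all i want for christmas is you",
]

# Count which character tends to follow which -- a crude version of what
# a character-level network has absorbed after one pass over the data.
follows = defaultdict(Counter)
for line in lines:
    for current, nxt in zip("^" + line, line + "$"):
        follows[current][nxt] += 1

def babble(rng, max_len=40):
    """Sample one pseudo-lyric, character by character."""
    out, ch = [], "^"
    while len(out) < max_len:
        nexts = follows[ch]
        ch = rng.choices(list(nexts), weights=list(nexts.values()))[0]
        if ch == "$":
            break
        out.append(ch)
    return "".join(out)

print(babble(random.Random(0)))
```

The output is pronounceable-ish nonsense: every adjacent pair of letters is plausible, but nothing longer than that is.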

[Image from somewhere in BigGAN latent space, generated using ganbreeder.app]

Here’s how well it did at imitating the song lines after it had seen all 3,847 lines just once in training (that is, at the end of Epoch 1).

All I was a light the goont dood and och go dread
I want the stars a gind was the stord
I can’t a live the sake you hee
I want you ging you sand the rook on the shook
The sharly goold the bodey

By the end of Epoch 2 it could at least spell a bit better, though it made no actual sense.

I can a got a baby
I can day more to make you make you a pan
I was a little time it make a love a friend
When I don’t got a love
I see up a bearsting
Well I was a pan and I ever mise
You can a baby, down
Here you one and you are now

By Epoch 3 it had settled on the strategy of using “baby” in pretty much every song.

I don’t know the baby
We’re the baby baby
I’m gonna get up and love is to the sea
The baby all the trains
You see you wanna take the something
We’re not on your road
I don’t know when I’m gone and thing
My baby but this baby love your eyes
Looking down the baby for you

Epoch 4: experimenting a bit more with new words, including some that don’t actually exist. Grammar: questionable.

You can see your face
I don’t want to be a room
You think a beautiful saids
There all the true of the horse
You don’t never think the street
You can dang the be sunshine
You love me funny things
I don’t never love you
You never good the darn tonight
You know you want you
I was air, the end is the light
The time, I was a train
There’s a tring at the boom
I’m not my bright on the brains the feeling on my heart

Fast-forwarding a bit to Epoch 7, the neural net’s progress is a bit less obvious. At least we can mostly figure out what it is trying to say. Mostly.

I was a little beautiful star
I was a funny life with a little sing
Hearing about a brain
I was so long to be a little baby
I got my love, I got a put you in my heart
I want you to the thing
I got a single and the world so lonely
You took my heart and shipportow
I can’t get up in love
When I was a strangle, day
Spinnarduachy nobody man whiskey, die where it come
Staps the phone
Yo babe, very where you are frow
You shazry waiting and I’m girlin’

We arrive now at the end of the training, in Epoch 15 (so chosen because progress had pretty much stopped). The neural net has learned quite a bit, really.

It has learned to do love songs.

I see you standing in the hair
I can dance if you want my life
Oh no, my name is in the sky
Oh, I’m in love with your head in my eyes
You look at me like there’s a strungle
I see you and I’m home, and I see your face for me
I’m so tired of being so hearts
Yeah I’m always love you

And dance tunes.

Love, love my eyes rollin’
Ive got to Seem sexy
Hello when I came home from ass
Oh, baby, do you make her bass?
Welp, a-sunshane’ said, doot doot doot dam i gim. dumber

And folk songs.

When i get up in the mirror
This is the world that you’re a burning here
I can’t explain my window
Welp, the lack of the sunshine
You are my eyes are the boot
I see you standin’ on the mind
I said it up, the year was 1778

And whatever the heck these are.

The lights in the eventoot, heaven me
I had a peanut, ooooonce colod me back at the sing
Look down! Couse Rider schoode
After all waits that you’re feelinged
Fingertippolming
Muskrat, pen up forever for me
Hush, you’re funny time
I was childrentin’ 

Look out, songwriters! AI is coming for your jobs!

I also collected some songs generated by the most highly-trained algorithm, but at a higher creativity level so they’re really really weird. To get the list (and optionally bonus material every time I post), sign up here.

CNN headlines, according to a neural net

lewisandquark:

The world is a chaotic and confusing place. Could advanced artificial intelligence help us make sense of it? Well, possibly, except that today’s “artificial intelligences” are not exactly what you’d call sophisticated. With a couple of hundred virtual neurons (as opposed to 16 billion neurons in the human brain), the neural networks I work with can only do limited, narrow tasks. Can they digest a list of CNN headlines and predict plausible new headlines based on what they’ve seen? No, but it’s fun to watch them try.

Thanks to Rachel Metz and Heather Kelly of CNN Business, I had a list of 8,438 headlines that have appeared on CNN Business over the past year. And thanks to Max Woolf’s textgenrnn, I had an algorithm that could learn to imitate them. In most of my previous experiments I’ve let neural networks try to build words and phrases letter by letter, because I like the strange made-up words like “indescribbening” and “Anthlographychology”. But to give the neural net a better chance of making the headlines grammatical, I decided to have it use whole words as building blocks. It could skip learning to spell “panther” and “cryptocurrency” and focus on structure. It helped. Sort of.
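(textgenrnn offers a word-level mode for exactly this.) The difference in what the model actually sees can be sketched in a few lines; the headline below is invented:

```python
import re

# A made-up headline, for illustration.
headline = "Premarket stocks surge on report of Starbucks earnings"

# Character-level: the model must learn spelling one letter at a time.
char_tokens = list(headline)

# Word-level: whole words are the atoms, so spelling comes for free
# and the model can spend its capacity on structure instead.
word_tokens = re.findall(r"\w+|[^\w\s]", headline.lower())

print(char_tokens[:10])  # ['P', 'r', 'e', 'm', 'a', ...]
print(word_tokens[:5])   # ['premarket', 'stocks', 'surge', 'on', 'report']
```

The trade-off: a word-level model can never misspell anything, but it also can never invent a word it hasn't seen.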

Early on in the training, it kept generating headlines that were completely blank. This was either a very nihilistic view of world affairs, or simply its calculation that a blank space was the most likely next word (occasionally a headline would just be: “The”). If I told it to be very very daring, then it would finally use words other than “The” in the headlines, generating things like:

Instagram of Suddenly
Its iPhone Look it
Facebook Wind
11 Fake Tesla My People
Million do Regret
Supermarket Disney New Label Signature Company: Why
Cordray to the SpaceX Coal Administration Africa Jared Internet Big the Talks
to Pizza Videos

(I added the capitalization). After much more training (about 30 min total on a fast GPU), it grew confident enough to use actual words more often. It had learned something about business as well.

Why the Stock Market is Trying to Get a Lot of Money
The US China Trade War is so Middle Class
Bank of the Stock Market is Now Now the Biggest Ever
The Best Way to Avoid Your Money
How Much You Need to Know About the New York City
How to Make a New Tax Law for Your Boss
The Stock Market Market is the Most Powerful Money
Goldman Sachs is a New Super Bowl
Facebook is Buying a Big Big Deal
Why Apps in the Country
5 Ways to Trump on Chipotle Industry is the Random Wedding
Premarket Stocks Surge on Report of Philadelphia Starbucks Starbucks Starbucks
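The “very very daring” knob above is the sampling temperature. A minimal sketch of how temperature reshapes the model’s next-word probabilities (the words and scores here are invented, not the real model’s):

```python
import math
import random

def temperature_probs(scores, temperature):
    """Softmax over log-scores divided by temperature: low temperature
    sharpens toward the favorite word, high temperature flattens the
    distribution ("daring")."""
    tokens = list(scores)
    scaled = [math.log(scores[t]) / temperature for t in tokens]
    m = max(scaled)  # subtract the max for numerical stability
    exps = [math.exp(s - m) for s in scaled]
    total = sum(exps)
    return {t: e / total for t, e in zip(tokens, exps)}

# Made-up next-word scores after the prompt "The".
scores = {"stock": 50.0, "market": 30.0, "pizza": 1.0, "regret": 1.0}

rng = random.Random(0)
for temp in (0.2, 2.0):
    probs = temperature_probs(scores, temp)
    pick = rng.choices(list(probs), weights=list(probs.values()))[0]
    print(temp, {t: round(p, 3) for t, p in probs.items()}, "->", pick)
```

At low temperature the model says “stock” almost every time; crank it up and “pizza” and “regret” start sneaking into the headlines.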

One curious pattern that emerged: companies behaving badly.

Walmart Grilled With a New Leader in Murder Tech
Coca-Cola is Scanning Your Messages for Big Chinese Tech
Amazon Wants to Make Money Broadcasting from Your Phone
Should I Pay My Workers
Amazon is Recalling 1 Trillion Jobs

My favorite headlines, though, were the most surreal.

Star Wars Episode IX Has New Lime Blazer
Mister Rogers in Washington
Black Panther Crushes the iPhone XS and XS Max Max
How to Build a Flying Car Car
You Make Doom Stocks
The Fly Species Came Back to Life
India Gets a Bad Mocktail Non Alcoholic Spirit
How to Buy a Nightmare

I talk a bit more about AI and creativity in this CNN article.

And that’s all the news for today – now it’s time for cake! Yes, I will now share with you two fine* recipes generated by a neural net trained solely on cake recipes: “Cargot Puddeki Wause Pound’s Favorite Ice Cream: Plant Tocha” and “Three Magee Coffee Cake”. To get them (and optionally, bonus material every time I post), sign up here.

Heirloom apples you’ve never tried, courtesy of AI

lewisandquark:

It’s apple season in the northern hemisphere! If you’re lucky enough to live in an apple-growing region, you may even be able to get ahold of heirloom apples. These older varieties have fallen out of favor, sometimes because their tree wasn’t robust enough, or they didn’t ship well. Sometimes you don’t find these heirlooms around because they are ugly as sin. Otherwise delicious! But they look like potatoes that were sat on by a bear, or cursed quest items that will transform you into a mushroom. The Apples of New York, published in 1905, lists thousands of heirloom apple varieties, and with these names as a starting point (I collected some modern varieties too, making about 2,500 names), I trained a neural network (textgenrnn) to come up with more.

The neural network’s names sound, um… they don’t sound like modern apple varieties. In fact, they sound a lot like they should be riding horses and waving broadswords.


Lady Bold
Mage
Little Nonsy
Red The Braw
Lady Fallstone
Baldy the Pearmain
Spitzenborn
Warflush
Bogma
Red Tomm of Bonesey
Lady Of the Light
Kentic The Steeler
Warrior Golden Pippin Of Bellandfust

The reason, of course, is that most of the dataset is made of pre-1905 apple varieties, and those don’t follow your silly modern naming conventions, all Honey- this and Sweet- that. The Apples of New York lists varieties such as Pig Snout, Peasgood’s Nonesuch, Cornish Gilliflower, Mason’s Improved, Pine Stump, Dark Baldivin, Duck’s Bill, and Greasy Pippin. 

Still, even by “Peasgood’s Nonesuch” standards, some of the neural net names are strange.


Camfer’s Braeburg Yerky
Severe Pea
Golden Red Red 
Spug
Sten’s Ooter
Queen Screepser
Steep’s Red Balober
Kulter of Death Orga
Starley’s Non Pippe
Black Rollow
Galler’s Baldwilling
Bellewan’s Seedline
Evil Red Janet
Baldword
Kleevil
Svand’s Sheepser
Bramboney
Lady Basters
Winey De Wine
Cheekes 
Gala Wowgwarps Luber Reineautillenova

And there were apple names that were worse. Apples to definitely avoid. Perhaps part of the problem was that my neural net had previously been trained on metal bands.


Fall Of Apple 
Ruin Red Sweet 81
English Death Galebury 
Knington Pooper 
Naw
Grime 
Rot 
Brains
Hellbrawk
Double Non 
Winter Red Spite
White Wolves 
Winesour 
Ruinstrees 
Worsen Red
Failing Puster 
Excreme

And, okay, to get these last few I maaaay have thrown in a bit more training on metal bands:


Dark the Pippin of Merdill
Descend The Fujion Seedling
Beyond pell of Pippin
Spirite Hell Desert Belle
King Golden Steel
Ancient Bearing Rock
Graverella

There were apples that were worse, and I’m definitely blaming those on the metal bands (though I did catch one or two apple names in the original Apples of New York that would raise some eyebrows). If you want to read them (and, if you like, get bonus stuff with each post), enter your email and I’ll send them to you.

More! Tomatoes, and general fruits

College courses of the future, courtesy of a neural network

lewisandquark:

There are a lot of strange courses that make it into a college course catalog. What would artificial intelligence make of them?

I train machine learning programs called neural networks to try to imitate human things – human things they are absolutely not prepared to understand. I’ve trained them to generate paint colors (Shy Bather or Stanky Bean, anyone?) and cat names (Mr. Tinkles is very affectionate) and even pie (please have a slice of Cromberry Yas). Could they have similar “success” at inventing new college courses?

UC San Diego’s Triton alumni magazine gave me UCSD’s entire course catalog, from “A Glimpse into Acting” to “Zionism and Post Zionism”, a few of which I recognized from when I was a grad student at UCSD. (Apparently I totally missed my opportunity to take “What the *#!?: An uncensored introduction to language”.) I gave the course catalog to a neural network framework called textgenrnn, which took a look at all the existing courses and tried its best to figure out how to make more like them.


It did come up with some intriguing courses. I’m not sure what these are, but I would at least read the course description.

Strange and Modern Biology
Marine Writing
General Almosts of Anthropology
Werestory
Deathchip Study
Advanced Smiling Equations
Genies and Engineering
Language of Circus Processing
Practicum Geology-Love
Electronics of Faces
Marine Structures
Devilogy
Psychology of Pictures in Archaeology
Melodic Studies in Collegine Mathematics

These next ones definitely sound as if they were written by a computer. Since this algorithm learns by example, any phrase, word, or even part of word that it sees repeatedly is likely to become one of its favorites. It knows that “istics” and “ing” both go at the end of words. But it doesn’t know which words, since it doesn’t know what words actually mean. It’s hard to tell if it’s trying to invent new college courses, or trying to make fun of them.

Advanced Computational Collegy
The Papering II
The Special Research
Introduction to Oceanies
Biologrative Studies
Professional Professional Pattering II
Every Methods
Introduction study to the Advanced Practices
Computer Programmic Mathematics of Paths
Paperistics Media I
Full Sciences
Chemistry of Chemistry
Internship to the Great
The Sciences of Prettyniss
Secrets Health
Survivery
Introduction to Economic Projects and Advanced Care and Station Amazies
Geophing and Braining
Marine Computational Secretites

It’s anyone’s guess what these next courses are, though, or what their prerequisites could possibly be. At least when you’re out looking for a job, you’ll be the only one with experience in programpineerstance.

Ancient Anthlographychology
Design and Equilitistry
The Boplecters
Numbling Hiss I
Advanced Indeptics and Techniques
Introduction in the Nano Care Practice of Planetical Stories
Ethemishing Health Analysis in Several Special Computer Plantinary III
Field Complexity in Computational Electrical Marketineering and Biology
Applechology: Media
The Conseminacy
The Sun Programpineerstance and Development
Egglish Computational Human Analysis
Advanced A World Globbilian Applications
Ethrography in Topics in the Chin Seminar
Seminar and Contemporary & Archase Acoa-Bloop African Computational for Project
Laboration and Market for Plun: Oceanography

Remember, artificial intelligence is the future! And without a strong background in Globbilian Applications, you’ll be left totally behind.

Just to see what would happen, I also did an experiment where I trained the neural net both on UCSD courses and on Dungeons and Dragons spells. The result was indeed quite weird. To read that set of courses (as well as optionally to get bonus material every time I post), enter your email here.

Your pumpkin spice experience, brought to you by AI

lewisandquark:

Here in the Northern hemisphere, there’s finally a chill in the air, bringing with it an avalanche of decorative gourds and a generous helping of pumpkin spice. Let’s see if an artificial neural network can get into the spirit of things.

Earlier, I trained a neural network to generate names of craft beers, thanks to Ryan Mandelbaum of Gizmodo, who inspired the project, and Andy Haraldson, who extracted hundreds of thousands of beer names from BeerAdvocate.com. The beer names came in categories, and one of them, as it turns out, was “Pumpkin”. Now, clearly, is the time for this category. I added the beers from the “spice” and “winter warmers” categories, making a total of 3,584 beers, and I gave the list to a neural network to try to imitate.


(Beer labels generated via Grogtag.com)

Kill Ale
Alive Ale
Lemonic Beer
Warmer Hollow
La Spiced Fright Brew
Organic Mar And Doug
Strawbone Masher
Not Beer
Bog Porter
Pumpkin Pickle
Blood Barrel Beer
Stumpkin Ale
Santalion Winter Ale
Pumpkin Man
Gruppie’s Pampkin Belging Main Ale
Winter Winter This Dead Ale

The names came out rather spookier than I had expected. Sometimes that happens when I forget that the neural net had previously been trained on metal bands or diseases or something, but in this case, the previous dataset had been Neopets foods.

So, naturally, my next step was to train this neural network for just a little while – just long enough – on metal bands. Via transfer learning, I could get the neural net to apply some of its pumpkin spice knowledge to its new task of generating metal bands. I just had to stop the training before catastrophic forgetting happened – that is, before the neural net forgot everything it knew about pumpkins and went 100% metal. A few seconds of training turned the pumpkin spice ales just the right amount of metal.
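A cartoon of transfer learning and catastrophic forgetting, using word counts as a stand-in for network weights (both word lists are invented):

```python
from collections import Counter

# Toy "datasets": what the net knew before, and what it's fine-tuned on.
pumpkin = "pumpkin spice ale winter warmer harvest pumpkin ale spice".split()
metal = "death grave blood doom skull death grave blood".split()

def fine_tune(base_words, new_words, steps):
    """Toy stand-in for transfer learning: each step mixes in another
    copy of the new dataset. A few steps gives a blend; too many steps
    drowns out the old vocabulary -- catastrophic forgetting."""
    counts = Counter(base_words)
    for _ in range(steps):
        counts.update(new_words)
    return [word for word, _ in counts.most_common(4)]

print(fine_tune(pumpkin, metal, 1))   # a blend: pumpkin words plus some metal
print(fine_tune(pumpkin, metal, 50))  # went 100% metal
```

The real mechanism is gradient updates overwriting learned weights rather than counts being swamped, but the stop-early intuition is the same: the sweet spot is a few steps in, before the old dataset’s influence disappears.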

Operation: Spoopify was a success.

Secret Death Ale
Ale Gore
Pumpkin Winter Holes
Flesh Head
Spice Gore
Spice Prophecy
Dead Pumpkin Storm
Pumpkin Area
Child Shadow Ale
Dragon’s Winter Horse
Pumpkin Rotten Illusage
Man Spine I
Purpky Stumpkin
Pumpkin Imperial Sin
Skin Ale
Bleeding Ale
Winter Suul
Pumpkin Disaster
Grave Void

But what if I want a slightly different feel? Less gory, more uncanny? Nobody does uncanny like the podcast Welcome to Night Vale, in which ominous lights appear above the Arby’s and screaming voids are a routine road hazard. It turns out that a neural net with Night Vale transcripts in its training history will retain strong and haunting memories of this past for quite a while. So, friends: Welcome to Night Vale Pumpkin Ale.


Faceless Ole Ale
[Head]
Oh Ale
Do I The Winter Face
Welcoming Ale
Hey God
Slacks.
Ginger Pull, Winking
Head The Secret Pumpkin
Pumpkin But Pumpkin and Oh But Pumpkin
Ale Human
OK?
I leaked the root like the heads
[BEEP]
Nothing Pumpkin Pumpkin Ale
I do need the news of The Guns
The Corrected Pumpkin Angel
Pumpkin’s Garfacksksknes

For the results of one more experiment in which I trained the neural net on the pumpkin ales plus Edgar Allan Poe’s “The Fall of the House of Usher”, as well as on the more, um, “spicy” pumpkin ales, enter your email here. You can optionally get cool bonus material every time I post!

Neural net does sound effects

lewisandquark:

The BBC has published their entire archive of 16,000 sound effects, recorded over many decades and available for public use. Existing sound effects include “Wild boars having tea”, “4 batter puddings thrown”, “Several men snoring hilariously”, and “indisposed chicken”, along with lots of horses, engines, and clocks. (NOTE: as with most of the links in this article, sound will play as soon as you visit the link)

I’m not sure what the BBC intended people to use these sound effects for, but neural network enthusiasts immediately recognize a grand opportunity to make computers say silly things. Dave Lawrence downloaded the list of sound effect names and trained a neural network to invent new names. Read the results here, here, and here. Some of my favorites that Dave generated:

Approach of piglets
Footsteps on ice, ponderous, parakeets
Fairgrounds: Ghost train ride with swords
Man punting metal hand
Waterfall into cattle bread

Unfortunately, we don’t know what these sound like, since it just generated the names of the effects. Now, it’s possible to train a neural net to generate entire new sounds, but I did something considerably simpler: I trained a text-only neural net to make up a new name, and then pick one of the 16,000 existing sounds to go with it. (link to dataset)
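One way to set this up (a sketch; the exact data format used here is my assumption, and the catalog descriptions below are invented, though the filenames appear in this post) is to put the invented name and the real filename on the same training line, joined by a delimiter, and then discard any generated line whose filename doesn’t actually exist:

```python
# Hypothetical slice of the BBC catalog: filename -> original description.
# The descriptions are made up; only the filename format matches the post.
catalog = {
    "07037122.wav": "Owl hooting, interior",
    "07042121.wav": "Telephone ringing, farmyard",
    "07005132.wav": "Piglets, country ambience",
}

# Each training line pairs the part to invent (the name) with the part
# that must be copied exactly (the filename), joined by a tab.
training_lines = [f"{desc}\t{fname}" for fname, desc in catalog.items()]

def parse_generated(line, catalog):
    """Split a generated line; keep it only if its filename really exists."""
    name, _, fname = line.partition("\t")
    return (name, fname) if fname in catalog else None

print(parse_generated("Horse o'clock\t07042121.wav", catalog))
print(parse_generated("Horse o'clock\t99999999.wav", catalog))
```

Because filenames appear verbatim and often in the training data, a character-level model finds copying one exactly much easier than composing one, which is consistent with the net learning to choose valid existing sounds.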

How well did it work? Well, the neural net did learn to choose valid existing sounds. I had to retrain it with a smaller, more interesting subset of the sound effects, because everything ended up being horses and heavy machinery. What you see below is a mix of results from both training runs. Click on the name of any of these, and it’ll play the sound the neural net thought should go with it. (Click on the number to find out the original name of the sound)

NOTE: sound will play as soon as you click the link.

07037122.wav Blinks.

07060061.wav 22 o’clock

07022197.wav German household operating.

07072020.wav Small small children 

07005132.wav Piglet country.

07042121.wav Telephone, with chickens

07005121.wav Birds on mixer.

07042045.wav Hound up.

07026006.wav Interior, four o’clock.

07045119.wav Household 2 man barks.

07005205.wav Agitated door cat, interior, chickens – 1972

07037347.wav Cars: 1980.

07037445.wav 1 woman walking (reprocessed)

When the neural net’s sound effect is weirder, it’s harder to say whether it’s right or not. I’ve never heard any of these. So… maybe?

07065152.wav Birds of thunder.

07037379.wav Sheep operating.

07042196.wav Horse o’clock.

07050158.wav London down – traffic closed.

07037496.wav Small man continuous large poop.

07005206.wav Gravel bears – 1967

07064036.wav Flying people, 10 men, interior, applause – 1984

07039214.wav Telephone women, individual mews.

07066034.wav Electric water.

07038073.wav Many Men and some thrown.

07023107.wav Horses singing

07065046.wav Person stork.

07037366.wav National Parrot Road.

07071022.wav Death Interior, exterior, diosel notes (reprocessed)

And I guess we’ll just have to trust the neural net on these.

07005073.wav Firewomadoellic Bear departing.

07002266.wav Horse hopping on bonged screet.

07045229.wav Dinghy passes away.

07032091.wav Infant of ground.

07032270.wav Warble Yarring hour.

07010012.wav Scoop chimestimes bling. (Stolling Ghorters)

07058161.wav Electric School train seven o’clock (saying crush.)

07023217.wav Steel sparrows activity & two machine work, suburb passing over machinery.

07068025.wav Flying rubber sea.

07065052.wav Peacock butter, with background clock children. 

07070107.wav Sixty Bubble Machine, 1967 

07071043.wav Sawing brain dumping on bus, bombs women run. 

07064008.wav Tempressed bow, rush of cows from machine with continuous singing

07039077.wav Lose Timber Machine of Button Transpoop opened

I also trained the neural net with the sound files and the names reversed – thus, I can finally ask it to pick a sound file to go with anything I want. Behold, long-standing mysteries solved by advanced artificial intelligence!

07042194.wav One hand clapping
07018034.wav Silence
07042215.wav T. Rex
07018033.wav The Beginning of the Universe
07005137.wav The music of the spheres

For some more of these (including the more PG-13 examples), as well as bonus material every time I post, join my mailing list!