neurodiversitysci:

dexer-von-dexer:

danshive:

In science fiction, AIs tend to malfunction due to some technicality of logic, such as that business with the laws of robotics and an AI reaching a dramatic, ironic conclusion.

Content regulation algorithms tell me that sci-fi authors are overly generous in these depictions.

“Why did cop bot arrest that nice elderly woman?”

“It insists she’s the mafia.”

“It thinks she’s in the mafia?”

“No. It thinks she’s an entire crime family. It filled out paperwork for multiple separate arrests after bringing her in.”

I have to comment on this because it touches on something I see a lot of people (including Tumblr staff and everyone else who uses these kinds of deep learning systems willy-nilly like this) don’t quite get: deep learning AI like these engage with reality in a fundamentally different way from humans. I see some people testing the algorithm and seeing where the “line” is, wondering whether it looks for things like color gradients, skin-tone pixels, certain shapes, curves, or what have you. All of these attempts to understand the algorithm fail because there is nothing to understand. There is no line, because there is no logic. You will never be able to pin down the “criteria” the algorithm uses to identify content, because the algorithm does not use logic at all to identify anything, only raw statistical correlations on top of statistical correlations on top of statistical correlations. There is no thought, no analysis, no reasoning. It does all its tasks through sheer unconscious intuition. The neural network is a shambling sleepwalker. It is madness incarnate. It knows nothing of human concepts like reason. It will think granny is the mafia.

This is why a lot of people say AIs are so dangerous. Not because they will one day wake up and be conscious and overthrow humanity, but because they (or at least this type of AI) are not and never will be conscious, and yet we’re relying on them to do things that require such human characteristics as logic and any sort of thought process whatsoever. Humans have a really bad tendency to anthropomorphize, and we’d like to think the AI is “making decisions” or “thinking,” but the truth is that what it’s doing is fundamentally different from either of those things. What we see as, say, a field of grass, a neural network may see as a bus stop. Not because there is actually a bus stop there, or because anything in the photo resembles a bus stop according to our understanding, but because the exact right pixels in the photo were shaded in the exact right way so that they just so happened to be statistically correlated with the arbitrary functions it created when it was repeatedly exposed to pictures of bus stops. It doesn’t know what grass is, or what a bus stop is, but it sure as hell will say with 99.999% certainty that one is in fact the other, for reasons you can’t understand, and will drive your automated bus off the road and into a ditch because of this undetectable statistical overlap. Because a few pixels were off in just the right way in just the right places and it got really, really confused for a second.

There, I even caught myself using the word “confused” to describe it. That’s not right, because “confused” is a human word. What’s happening with the AI is something we don’t have the language to describe.

What’s more, this sort of trickery can be mimicked. A human wouldn’t be able to figure it out, but another neural network can easily approximate the statistical filters it uses to identify things and figure out how to alter images with some white noise in exactly the right way to make the algorithm think it’s actually something else. It’ll still look like the original image, just with some pixelated artifacts, but the algorithm will see it as something completely different. This is what’s known as an “adversarial attack” (in the extreme case, a “one-pixel attack” can flip the classification by changing a single pixel). I’m fairly confident porn bot creators will end up cracking the content flagging algorithm and start putting up some weirdly pixelated porn anyway, and all of this will be in vain. All because Tumblr staff decided to rely on content moderation via slot machine.
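For the technically curious: the classic demonstration of this kind of image-warping trickery is the fast gradient sign method (FGSM). Here’s a minimal sketch assuming a PyTorch image classifier; the model choice and epsilon are purely illustrative, not anything Tumblr actually uses:

```python
# Minimal sketch of an FGSM-style adversarial perturbation
# (illustrative only; nothing here is Tumblr's actual system).
import torch
import torch.nn.functional as F
from torchvision import models

model = models.resnet18(pretrained=True).eval()  # stand-in classifier

def fgsm_perturb(image, true_label, epsilon=0.01):
    """Nudge every pixel slightly in the direction that most increases
    the classifier's loss. To a human the image looks unchanged; the
    classifier can flip to a confidently wrong answer."""
    image = image.clone().detach().requires_grad_(True)
    loss = F.cross_entropy(model(image), true_label)
    loss.backward()
    # move each pixel by +/- epsilon along the sign of the loss gradient
    adversarial = image + epsilon * image.grad.sign()
    return adversarial.clamp(0, 1).detach()

# usage: x is a (1, 3, 224, 224) image tensor scaled to [0, 1],
# y the true class index:  x_adv = fgsm_perturb(x, torch.tensor([y]))
```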

TL;DR bots are illogical because they’re actually unknowable eldritch horrors made of spreadsheets and we don’t know how to stop them or how they got here, send help

This is such an accurate description of machine learning. Sadly, it’s also the best computational model we have of how babies learn words.

Tumblr recently clarified that nudity is acceptable in art, descriptions of breastfeeding and childbirth, and other non-porn uses. As they should. But don’t let that lull you into a false sense of security. They CAN’T keep their promise using machine learning alone – certainly not with crappy algorithms like “look for skin tones and curves.” Distinguishing porn from simple nudity is a somewhat subjective, culturally based task that challenges smart humans. No set of statistical patterns, however sophisticated, can make that judgment.

How to begin a song

lewisandquark:

So a while ago I made my first Twitter/Mastodon bot, a very simple little bot called TheSingalongBot. It came out of a chance conversation at the XOXO conference with Kate Compton and Gretchen McCulloch, and all it does is toot/tweet/twoot the first line of a song. Humans can sing along if they want. (To date, SingalongBot followers have finished the ABC song, A British Tar is a Soaring Soul, When Will My Life Begin, and, heroically, The Saga Begins. They are also about 80% through 99 Bottles of Beer on the Wall.)

To build up a suitable list of first lines for the bot, I started with a list of Billboard top hits from the last 50 years. And then, because I was worried that the top hits might exclude some of the best songs, I asked people to submit the catchiest songs they knew. I got over 1,500 suggestions. Once I removed duplicates (“I threw a wish in the well” was submitted 14 times, followed closely by “Is this the real life?”) and combined them with the Billboard hits, I had 3,847 songs. The bot is going strong, and meanwhile I have this dataset.
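As a rough sketch of that dataset prep, the merge-and-dedup step is a few lines of Python (file names here are hypothetical):

```python
# Sketch of the merge-and-dedup step described above.
def load_lines(path):
    with open(path, encoding='utf-8') as f:
        return [line.strip() for line in f if line.strip()]

billboard = load_lines('billboard_hits.txt')
submissions = load_lines('submitted_songs.txt')

seen, songs = set(), []
for line in billboard + submissions:
    key = line.lower().rstrip('.?!')   # so near-identical submissions collapse
    if key not in seen:
        seen.add(key)
        songs.append(line)

print(len(songs))  # 3,847 in the original experiment
```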

Yes, of course I trained a neural net on it.

[Image from somewhere in BigGAN latent space, using ganbreeder.app]
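This post doesn’t say which framework it used, but later posts in this collection use Max Woolf’s textgenrnn, so a plausible minimal setup might look like the sketch below (the input file name is made up). Setting gen_epochs=1 prints sample output at the end of every epoch, which is where lists like the ones that follow come from.

```python
from textgenrnn import textgenrnn

textgen = textgenrnn()
# one song first-line per row; print generated samples after every epoch
textgen.train_from_file('song_first_lines.txt',
                        num_epochs=15,
                        gen_epochs=1)
```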

Here’s how well it did at imitating the song lines after it had seen all 3,847 lines just once in training (that is, at the end of Epoch 1).

All I was a light the goont dood and och go dread
I want the stars a gind was the stord
I can’t a live the sake you hee
I want you ging you sand the rook on the shook
The sharly goold the bodey

By the end of Epoch 2 it could at least spell a bit better, though it made no actual sense.

I can a got a baby
I can day more to make you make you a pan
I was a little time it make a love a friend
When I don’t got a love
I see up a bearsting
Well I was a pan and I ever mise
You can a baby, down
Here you one and you are now

By Epoch 3 it has settled on the strategy of using “baby” in pretty much every song.

I don’t know the baby
We’re the baby baby
I’m gonna get up and love is to the sea
The baby all the trains
You see you wanna take the something
We’re not on your road
I don’t know when I’m gone and thing
My baby but this baby love your eyes
Looking down the baby for you

Epoch 4: experimenting a bit more with new words, including some that don’t actually exist. Grammar: questionable.

You can see your face
I don’t want to be a room
You think a beautiful saids
There all the true of the horse
You don’t never think the street
You can dang the be sunshine
You love me funny things
I don’t never love you
You never good the darn tonight
You know you want you
I was air, the end is the light
The time, I was a train
There’s a tring at the boom
I’m not my bright on the brains the feeling on my heart

Fast-forwarding a bit to Epoch 7, the neural net’s progress is a bit less obvious. At least we can mostly figure out what it is trying to say. Mostly.

I was a little beautiful star
I was a funny life with a little sing
Hearing about a brain
I was so long to be a little baby
I got my love, I got a put you in my heart
I want you to the thing
I got a single and the world so lonely
You took my heart and shipportow
I can’t get up in love
When I was a strangle, day
Spinnarduachy nobody man whiskey, die where it come
Staps the phone
Yo babe, very where you are frow
You shazry waiting and I’m girlin’

We arrive now at the end of the training, in Epoch 15 (so chosen because progress had pretty much stopped). The neural net has learned quite a bit, really.

It has learned to do love songs.

I see you standing in the hair
I can dance if you want my life
Oh no, my name is in the sky
Oh, I’m in love with your head in my eyes
You look at me like there’s a strungle
I see you and I’m home, and I see your face for me
I’m so tired of being so hearts
Yeah I’m always love you

And dance tunes.

Love, love my eyes rollin’
Ive got to Seem sexy
Hello when I came home from ass
Oh, baby, do you make her bass?
Welp, a-sunshane’ said, doot doot doot dam i gim. dumber

And folk songs.

When i get up in the mirror
This is the world that you’re a burning here
I can’t explain my window
Welp, the lack of the sunshine
You are my eyes are the boot
I see you standin’ on the mind
I said it up, the year was 1778

And whatever the heck these are.

The lights in the eventoot, heaven me
I had a peanut, ooooonce colod me back at the sing
Look down! Couse Rider schoode
After all waits that you’re feelinged
Fingertippolming
Muskrat, pen up forever for me
Hush, you’re funny time
I was childrentin’ 

Look out, songwriters! AI is coming for your jobs!

I also collected some songs generated by the most highly-trained algorithm, but at a higher creativity level so they’re really really weird. To get the list (and optionally bonus material every time I post), sign up here.
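(The “creativity level” here is the sampling temperature. Continuing the hypothetical textgenrnn sketch from earlier in this post, raising it makes the net take less-probable choices at every step:)

```python
# higher temperature = less-probable choices at each step = weirder songs
textgen.generate(5, temperature=0.4)   # safe and repetitive
textgen.generate(5, temperature=1.2)   # really, really weird
```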

New neural net snakes

gallusrostromegalus:

lewisandquark:

There’s a kind of neural network that learns to imitate whatever text you give it, whether that’s recipes, song lyrics, or even the names of guinea pigs.

Their imitations are often imperfect (knowing only what’s in their dataset, they accidentally come up with things they don’t realize are bad ideas). But one area where they tend to do well is inventing new species of things. The neural net’s birds were entirely believable, and its fish were generally no stranger than the species that already exist. So for my next project, I decided to generate some snakes.

I collected English common names for about 1,000 snakes and started training.

The first thing I noticed is that its snake names were a lot more noticeably fake than its bird or fish names – the snake dataset is way smaller, so it had far fewer examples to learn from.

Tostlesnake
Sine cobra
Snoked snake
Cancan rattlesnake
Chippen’s putter python
Southern coat snake
Pinkwarm’s Copperanada
Smart sea snake
Western Nack
Blonded snake
Ham’s Pattlescops
Green tree nosh Snake
Hecker’s sea snake
Ned-scaled tree viper
Barned dater Snake
Smalle’s mock ractlesnake
Bland brown snake
Corned python
Common bust viper
Smorthead Garter Snake

Some snakes did approach the level of believability. You might be able to bluff some herpetologists into thinking these are real.

Many-nosed Snake
Cornhead snake
Arizona liger snake
Mangbow’s Earth Snake
Wing snake
Banded gutter snake
Jacucan balm snake
Banded guff adder
Bamboo tire snake
Rave hognose Snake
Tree-nosed adder
Bland-headed tree snake

Good luck with these, though.


Texan farter snake
Shite snake
Spitty rattlesnake
Thing snake
Brown brown Black Snake
Tamestail farter Snake
Black-neded tampon
Madeshine spite- racer
Bognia scat snake

I also decided to see what would happen if I trained a neural net both on snakes AND on Halloween costumes. Pleasingly, here are some of the snakes it came up with:

Wonder snake
Fairy rattlesnake
The Spacer Snake
Robo snake
Sexy cobra
Bob dog tree Snake

I had way too much fun generating those, and ended up generating more than would fit here. If you’d like to read the rest of them (and optionally get bonus material every time I post), enter your email here.

I’m particularly fond of the “Common Bust Snake”, which sounds like it lives in a particularly Rube Goldberg “booby trap”

CNN headlines, according to a neural net

lewisandquark:

The world is a chaotic and confusing place. Could advanced artificial intelligence help us make sense of it? Well, possibly, except that today’s “artificial intelligences” are not exactly what you’d call sophisticated. With a couple of hundred virtual neurons (as opposed to the 16 billion neurons in the human cerebral cortex), the neural networks I work with can only do limited, narrow tasks. Can they digest a list of CNN headlines and predict plausible new headlines based on what they’ve seen? No, but it’s fun to watch them try.

Thanks to Rachel Metz and Heather Kelly of CNN Business, I had a list of 8,438 headlines that have appeared on CNN Business over the past year. And thanks to Max Woolf’s textgenrnn, I had an algorithm that could learn to imitate them. In most of my previous experiments I’ve let neural networks try to build words and phrases letter by letter, because I like the strange made-up words like “indescribbening” and “Anthlographychology”. But to give the neural net a better chance of making the headlines grammatical, I decided to have it use words as building blocks. It could skip learning to spell “panther” and “cryptocurrency” and focus on structure. It helped. Sort of.
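Since the post names textgenrnn, here’s a hedged sketch of that word-level switch (the headline file name is made up); as documented for the library, word_level=True with a new model makes the net choose whole words from a fixed vocabulary instead of spelling letter by letter:

```python
from textgenrnn import textgenrnn

textgen = textgenrnn()
# build a fresh model that treats each word, not each letter, as a token
textgen.train_from_file('cnn_headlines.txt',
                        new_model=True,
                        word_level=True,
                        num_epochs=20)
```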

Early on in the training, it kept generating headlines that were completely blank. This was either a very nihilistic view of world affairs, or its calculation that a blank space was the most likely next token (occasionally a headline would just be: “The”). If I told it to be very, very daring, then it would finally use words other than “The” in the headlines, generating things like:

Instagram of Suddenly
Its iPhone Look it
Facebook Wind
11 Fake Tesla My People
Million do Regret
Supermarket Disney New Label Signature Company: Why
Cordray to the SpaceX Coal Administration Africa Jared Internet Big the Talks
to Pizza Videos

(I added the capitalization). After much more training (about 30 min total on a fast GPU), it grew confident enough to use actual words more often. It had learned something about business as well.

Why the Stock Market is Trying to Get a Lot of Money
The US China Trade War is so Middle Class
Bank of the Stock Market is Now Now the Biggest Ever
The Best Way to Avoid Your Money
How Much You Need to Know About the New York City
How to Make a New Tax Law for Your Boss
The Stock Market Market is the Most Powerful Money
Goldman Sachs is a New Super Bowl
Facebook is Buying a Big Big Deal
Why Apps in the Country
5 Ways to Trump on Chipotle Industry is the Random Wedding
Premarket Stocks Surge on Report of Philadelphia Starbucks Starbucks Starbucks

One curious pattern that emerged: companies behaving badly.

Walmart Grilled With a New Leader in Murder Tech
Coca-Cola is Scanning Your Messages for Big Chinese Tech
Amazon Wants to Make Money Broadcasting from Your Phone
Should I Pay My Workers
Amazon is Recalling 1 Trillion Jobs

My favorite headlines, though, were the most surreal.

Star Wars Episode IX Has New Lime Blazer
Mister Rogers in Washington
Black Panther Crushes the iPhone XS and XS Max Max
How to Build a Flying Car Car
You Make Doom Stocks
The Fly Species Came Back to Life
India Gets a Bad Mocktail Non Alcoholic Spirit
How to Buy a Nightmare

I talk a bit more about AI and creativity in this CNN article.

And that’s all the news for today – now it’s time for cake! Yes, I will now share with you two fine* recipes generated by a neural net trained solely on cake recipes: “Cargot Puddeki Wause Pound’s Favorite Ice Cream: Plant Tocha” and “Three Magee Coffee Cake”. To get them (and optionally, bonus material every time I post), sign up here.

codeman38:

It’s been well over a month since the last time that I posted a selection of surrealist neural-network-generated Wikipedia article titles, so here are 25 more of them selected from recent runs of the model:

  • Elector Dog Line
  • Hard Smith (actor)
  • Chicken Winter (movie)
  • Least Health Sweet Complex
  • St. Mess Martin (surname)
  • Joy In Bark (disambiguation)
  • Ether Percrachen (horse)
  • Letter-on-Married Wayne
  • Texas of the Motor (Ipan album)
  • James Priesto (Currency of Acid)
  • Times and Pearshape (disambiguation)
  • 324 New York Crack Fool Market
  • James Basketball (disumbian player)
  • Empire Hollowship of France
  • Abhammer’s Processive Campbell
  • Transformation of Charles Christmas
  • State Route 15 (Rail Compression, France)
  • Rock of Charles Ferry (Australian premier)
  • Kermington State Route 737 (Connecticopria album)
  • Lord in the Outline and Airlines of the Apparatesiana
  • The Crowdy State House of Winniplow Despection
  • The Croom of Society Airway
  • Sexing der del Anistal Goof
  • Butthological album, 1976
  • Barf Trophylot

Heirloom apples you’ve never tried, courtesy of AI

lewisandquark:

It’s apple season in the northern hemisphere! If you’re lucky enough to live in an apple-growing region, you may even be able to get ahold of heirloom apples. These older varieties have fallen out of favor, sometimes because their tree wasn’t robust enough, or they didn’t ship well. Sometimes you don’t find these heirlooms around because they are ugly as sin. Otherwise delicious! But they look like potatoes that were sat on by a bear, or cursed quest items that will transform you into a mushroom. The Apples of New York, published in 1905, lists thousands of heirloom apple varieties, and with these names as a starting point (I collected some modern varieties too, making about 2,500 names), I trained a neural network (textgenrnn) to come up with more.

The neural network’s names sound, um… they don’t sound like modern apple varieties. In fact, they sound a lot like they should be riding horses and waving broadswords.


Lady Bold
Mage
Little Nonsy
Red The Braw
Lady Fallstone
Baldy the Pearmain
Spitzenborn
Warflush
Bogma
Red Tomm of Bonesey
Lady Of the Light
Kentic The Steeler
Warrior Golden Pippin Of Bellandfust

The reason, of course, is that most of the dataset is made of pre-1905 apple varieties, and those don’t follow your silly modern naming conventions, all Honey-this and Sweet-that. The Apples of New York lists varieties such as Pig Snout, Peasgood’s Nonesuch, Cornish Gilliflower, Mason’s Improved, Pine Stump, Dark Baldivin, Duck’s Bill, and Greasy Pippin.

Still, even by “Peasgood’s Nonesuch” standards, some of the neural net names are strange.


Camfer’s Braeburg Yerky
Severe Pea
Golden Red Red 
Spug
Sten’s Ooter
Queen Screepser
Steep’s Red Balober
Kulter of Death Orga
Starley’s Non Pippe
Black Rollow
Galler’s Baldwilling
Bellewan’s Seedline
Evil Red Janet
Baldword
Kleevil
Svand’s Sheepser
Bramboney
Lady Basters
Winey De Wine
Cheekes 
Gala Wowgwarps Luber Reineautillenova

And there were apple names that were worse. Apples to definitely avoid. Perhaps part of the problem was that my neural net had previously been trained on metal bands.


Fall Of Apple 
Ruin Red Sweet 81
English Death Galebury 
Knington Pooper 
Naw
Grime 
Rot 
Brains
Hellbrawk
Double Non 
Winter Red Spite
White Wolves 
Winesour 
Ruinstrees 
Worsen Red
Failing Puster 
Excreme

And, okay, to get these last few I maaaay have thrown in a bit more training on metal bands:


Dark the Pippin of Merdill
Descend The Fujion Seedling
Beyond pell of Pippin
Spirite Hell Desert Belle
King Golden Steel
Ancient Bearing Rock
Graverella

There were apples that were worse, and I’m definitely blaming those on the metal bands (though I did catch one or two apple names in the original Apples of New York that would raise some eyebrows). If you want to read them (and, if you like, get bonus stuff with each post), enter your email and I’ll send them to you.

More! Tomatoes, and general fruits

College courses of the future, courtesy of a neural network

lewisandquark:

There are a lot of strange courses that make it into a college course catalog. What would artificial intelligence make of them?

I train machine learning programs called neural networks to try to imitate human things – human things they absolutely are not prepared to understand. I’ve trained them to generate paint colors (Shy Bather or Stanky Bean, anyone?) and cat names (Mr. Tinkles is very affectionate) and even pie (please have a slice of Cromberry Yas). Could they have similar “success” at inventing new college courses?

UC San Diego’s Triton alumni magazine gave me UCSD’s entire course catalog, from “A Glimpse into Acting” to “Zionism and Post Zionism”, a few of which I recognized from when I was a grad student at UCSD. (Apparently I totally missed my opportunity to take “What the *#!?: An uncensored introduction to language”.) I gave the course catalog to a neural network framework called textgenrnn, which took a look at all the existing courses and tried its best to figure out how to make more like them.


It did come up with some intriguing courses. I’m not sure what these are, but I would at least read the course description.

Strange and Modern Biology
Marine Writing
General Almosts of Anthropology
Werestory
Deathchip Study
Advanced Smiling Equations
Genies and Engineering
Language of Circus Processing
Practicum Geology-Love
Electronics of Faces
Marine Structures
Devilogy
Psychology of Pictures in Archaeology
Melodic Studies in Collegine Mathematics

These next ones definitely sound as if they were written by a computer. Since this algorithm learns by example, any phrase, word, or even part of word that it sees repeatedly is likely to become one of its favorites. It knows that “istics” and “ing” both go at the end of words. But it doesn’t know which words, since it doesn’t know what words actually mean. It’s hard to tell if it’s trying to invent new college courses, or trying to make fun of them.

Advanced Computational Collegy
The Papering II
The Special Research
Introduction to Oceanies
Biologrative Studies
Professional Professional Pattering II
Every Methods
Introduction study to the Advanced Practices
Computer Programmic Mathematics of Paths
Paperistics Media I
Full Sciences
Chemistry of Chemistry
Internship to the Great
The Sciences of Prettyniss
Secrets Health
Survivery
Introduction to Economic Projects and Advanced Care and Station Amazies
Geophing and Braining
Marine Computational Secretites

It’s anyone’s guess what these next courses are, though, or what their prerequisites could possibly be. At least when you’re out looking for a job, you’ll be the only one with experience in programpineerstance.

Ancient Anthlographychology
Design and Equilitistry
The Boplecters
Numbling Hiss I
Advanced Indeptics and Techniques
Introduction in the Nano Care Practice of Planetical Stories
Ethemishing Health Analysis in Several Special Computer Plantinary III
Field Complexity in Computational Electrical Marketineering and Biology
Applechology: Media
The Conseminacy
The Sun Programpineerstance and Development
Egglish Computational Human Analysis
Advanced A World Globbilian Applications
Ethrography in Topics in the Chin Seminar
Seminar and Contemporary & Archase Acoa-Bloop African Computational for Project
Laboration and Market for Plun: Oceanography

Remember, artificial intelligence is the future! And without a strong background in Globbilian Applications, you’ll be left totally behind.

Just to see what would happen, I also did an experiment where I trained the neural net both on UCSD courses and on Dungeons and Dragons spells. The result was indeed quite weird. To read that set of courses (as well as optionally to get bonus material every time I post), enter your email here.

Your pumpkin spice experience, brought to you by AI

lewisandquark:

Here in the Northern hemisphere, there’s finally a chill in the air, bringing with it an avalanche of decorative gourds and a generous helping of pumpkin spice. Let’s see if an artificial neural network can get into the spirit of things.

Earlier, I trained a neural network to generate names of craft beers, thanks to Ryan Mandelbaum of Gizmodo, who inspired the project, and Andy Haraldson, who extracted hundreds of thousands of beer names from BeerAdvocate.com. The beer names came in categories, and one of them, as it turns out, was “Pumpkin”. Now, clearly, is the time for this category. I added the beers from the “spice” and “winter warmers” categories, making a total of 3,584 beers, and I gave the list to a neural network to try to imitate.


(Beer labels generated via Grogtag.com)

Kill Ale
Alive Ale
Lemonic Beer
Warmer Hollow
La Spiced Fright Brew
Organic Mar And Doug
Strawbone Masher
Not Beer
Bog Porter
Pumpkin Pickle
Blood Barrel Beer
Stumpkin Ale
Santalion Winter Ale
Pumpkin Man
Gruppie’s Pampkin Belging Main Ale
Winter Winter This Dead Ale

The names came out rather spookier than I had expected. Sometimes that happens when I forget that the neural net had previously been trained on metal bands or diseases or something, but in this case, the previous dataset had been Neopets foods.

So, naturally, my next step was to train this neural network for just a little while – just long enough – on metal bands. Via transfer learning, I could get the neural net to apply some of its pumpkin spice knowledge to its new task of generating metal bands. I just had to stop the training before catastrophic forgetting happened – that is, before the neural net forgot everything it knew about pumpkins and just went 100% metal. It took just a few seconds of training to turn the pumpkin spice ales just the right amount of metal.
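In textgenrnn terms, that transfer-learning step is just reloading the saved weights and training briefly on the new dataset; a sketch, with hypothetical file names:

```python
from textgenrnn import textgenrnn

# reload the weights saved after training on the pumpkin/spice beer names
textgen = textgenrnn('pumpkin_ales_weights.hdf5')

# a brief pass over metal band names; train much longer and catastrophic
# forgetting sets in: the beer knowledge disappears entirely
textgen.train_from_file('metal_bands.txt', num_epochs=1)
textgen.generate(10, temperature=0.8)
```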

Operation: Spoopify was a success.

Secret Death Ale
Ale Gore
Pumpkin Winter Holes
Flesh Head
Spice Gore
Spice Prophecy
Dead Pumpkin Storm
Pumpkin Area
Child Shadow Ale
Dragon’s Winter Horse
Pumpkin Rotten Illusage
Man Spine I
Purpky Stumpkin
Pumpkin Imperial Sin
Skin Ale
Bleeding Ale
Winter Suul
Pumpkin Disaster
Grave Void

But what if I want a slightly different feel? Less gory, more uncanny? Nobody does uncanny like the podcast Welcome to Night Vale, in which ominous lights appear above the Arby’s and screaming voids are a routine road hazard. It turns out that a neural net with Night Vale transcripts in its training history will retain strong and haunting memories of this past for quite a while. So friends, Welcome to Night Vale Pumpkin Ale.


Faceless Ole Ale
[Head]
Oh Ale
Do I The Winter Face
Welcoming Ale
Hey God
Slacks.
Ginger Pull, Winking
Head The Secret Pumpkin
Pumpkin But Pumpkin and Oh But Pumpkin
Ale Human
OK?
I leaked the root like the heads
[BEEP]
Nothing Pumpkin Pumpkin Ale
I do need the news of The Guns
The Corrected Pumpkin Angel
Pumpkin’s Garfacksksknes

For the results of one more experiment, in which I trained the neural net on the pumpkin ales plus Edgar Allan Poe’s “The Fall of the House of Usher”, as well as the more, um, “spicy” pumpkin ales, enter your email here. You can optionally get cool bonus material every time I post!

Perfectly normal shopping malls, named by neural network

lewisandquark:

The names of American shopping malls are a carefully calculated combination of bland and grandiose. Even the plainest of strip malls will have a faded sign somewhere proclaiming it to be the “Westbrook Manor Shoppes at Town Center Mall” or something of that nature. What happens if a machine learning algorithm tries to imitate this?

Thanks to Keith Wezwick I had a dataset of 1,106 existing shopping malls – a smallish dataset but one with enough consistency that I thought a neural net might be able to get the hang of it. I gave the dataset to char-rnn, a type of character-level recurrent neural network. Unlike some other neural networks I’ve used, this one starts from scratch – when it has its first look at the dataset, its neurons are connected randomly, with no built-in knowledge of any other datasets or even of English.
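char-rnn itself is written in Lua/Torch, but the idea is compact enough to sketch from scratch; here’s a minimal character-level version in PyTorch (the mall file name is hypothetical), mainly to show that the model really does start with random weights and no knowledge of English:

```python
# A from-scratch character-level RNN in the spirit of char-rnn.
import torch
import torch.nn as nn

text = open('malls.txt', encoding='utf-8').read()  # hypothetical file
chars = sorted(set(text))                          # the model's whole alphabet
char_to_ix = {c: i for i, c in enumerate(chars)}

class CharRNN(nn.Module):
    def __init__(self, vocab_size, hidden=128):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, hidden)
        self.rnn = nn.LSTM(hidden, hidden, batch_first=True)
        self.out = nn.Linear(hidden, vocab_size)

    def forward(self, x, state=None):
        h, state = self.rnn(self.embed(x), state)
        return self.out(h), state

model = CharRNN(len(chars))            # weights start out random
optimizer = torch.optim.Adam(model.parameters(), lr=2e-3)
loss_fn = nn.CrossEntropyLoss()

encoded = torch.tensor([char_to_ix[c] for c in text])
seq_len = 64
for step in range(2000):
    # sample a random 64-character window; predict each next character
    i = torch.randint(0, len(encoded) - seq_len - 1, (1,)).item()
    x = encoded[i:i + seq_len].unsqueeze(0)
    y = encoded[i + 1:i + seq_len + 1].unsqueeze(0)
    logits, _ = model(x)
    loss = loss_fn(logits.view(-1, len(chars)), y.view(-1))
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
```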

After a few passes through the dataset, it has learned to use letters and spaces, and has even learned some of the most common words. You can probably tell these are supposed to be shopping centers. You can also probably tell that there’s something terribly wrong with them.

Rre Gostge
Toreson Shoppiol Trape Center
The Shopp Mall
Preen Center
CoKies Mall
Shoppin Stophend
8!oon Center
Wastfield Stopas Center
Lieemsoo ah Tre Stops Mall
Woller Vallery
Baspoon Towne Center
Cowpe Toeoe Center Lrnme Cherry Center Warleros Oewves Mall

(To find out what these malls looked like, I asked AttnGAN, an algorithm trained to generate an image to go with any phrase)

But after more training, the mall-naming algorithm got… a bit better. By the time it had looked through the list of malls about 13 times, it was reproducing some malls word-for-word. I didn’t really intend for it to plagiarize malls verbatim from its input data, but the problem is I had told it to produce more malls like the ones it saw, and as far as it’s concerned, a perfect solution is to copy them. (This problem is called overfitting, and shows up in all sorts of annoying ways in machine learning research.) It did produce original malls too, though, and its original malls were definitely noticeable as neural net creations.

Bointy Mall
Fall of Lruin Mall
Princer Mall
Gollfop Mall 
East Bointy Mall
North Drain Mall
Town Center at Citylands
Galleria Shrps at Santa Mariatun
Outlets of the Source Mall
Peachdate Mall
Willowser Pork Mall
Mall of testland Mall
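One crude way to catch the word-for-word copying described above is to screen the output against the training list; a sketch, with hypothetical names:

```python
# flag generated malls that are verbatim copies of the training data
with open('malls.txt', encoding='utf-8') as f:
    training = {line.strip().lower() for line in f}

generated = ['Bointy Mall', 'North Drain Mall', 'Town Center at Citylands']
novel = [name for name in generated if name.lower() not in training]
```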

So the mall-generating neural net never quite got out of the “definitely not a real mall” territory. Could they get even more unsettling? The answer is, delightfully, yes. Here’s the output from a neural net (textgenrnn, this time), that was trained on the shopping mall dataset, but only after it was trained on transcripts from the spooky podcast Welcome to Night Vale. In Night Vale, every conspiracy theory is true, and deadly figures haunting the dog park, or mysterious glowing clouds, are just part of everyday life. Night Vale has a mall. It’s called “Night Vale Mall.” Seeing as it has in the past suffered outbreaks of deadly poison gas, even deadlier Valentine’s Day cards, and some kind of screaming vortex in the food court (and we don’t even know why East Night Vale Mall is now disused), it is just possible that Night Vale may be needing to name a new mall sometime in the near future. Perhaps one of these names will be suitable.

Burning Park Mall
Person Shell
The Shape
All Owl Mall
Place
Square Mall
Complete store of Mall
The What is Mall Mall
Many Head Mall
Mall Glow Place
Chanting Place
South Unit Presence
This is Center Mall
Goodnight Mall
Mall Pill Press Office Blood Park Mall
Carlous Preferse was all danger the Shoppendatoland Burning Shaper Mall

For more unsettling shopping malls (including one adults-only mall), as well as bonus material each time I post, enter your email here. It’s perfectly safe. Probably. Just stay away from the South Unit Presence.

Neural net does sound effects

lewisandquark:

The BBC has published their entire archive of 16,000 sound effects, recorded over many decades and available for public use. Existing sound effects include “Wild boars having tea”, “4 batter puddings thrown”, “Several men snoring hilariously,” and “indisposed chicken,” along with lots of horses, engines, and clocks. (NOTE: as with most of the links in this article, sound will play as soon as you visit the link)

I’m not sure what the BBC intended people to use these sound effects for, but neural network enthusiasts immediately recognize a grand opportunity to make computers say silly things. Dave Lawrence downloaded the list of sound effect names and trained a neural network to invent new names. Read the results here, here, and here. Some of my favorites that Dave generated:

Approach of piglets
Footsteps on ice, ponderous, parakeets
Fairgrounds: Ghost train ride with swords
Man punting metal hand
Waterfall into cattle bread

Unfortunately, we don’t know what these sound like, since it just generated the names of the effects. Now, it’s possible to train a neural net to generate entire new sounds, but I did something considerably simpler: I trained a text-only neural net to make up a new name, and then pick one of the 16,000 existing sounds to go with it. (link to dataset)
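One hedged guess at how that pairing could be set up (the CSV name and separator below are hypothetical): train the text net on lines that couple each sound’s name with its file ID, so it learns to emit the two together. Flipping the order gives the reversed experiment near the end of this post.

```python
# build "name | file ID" training lines from the effects list
lines = []
with open('bbc_sound_effects.csv', encoding='utf-8') as f:
    for row in f:
        file_id, name = row.rstrip('\n').split(',', 1)
        lines.append(f'{name} | {file_id}')        # name -> sound file
        # reversed variant: lines.append(f'{file_id} | {name}')

with open('sound_training_lines.txt', 'w', encoding='utf-8') as out:
    out.write('\n'.join(lines))
```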

How well did it work? Well, the neural net did learn to choose valid existing sounds. I had to retrain it with a smaller, more interesting subset of the sound effects, because everything ended up being horses and heavy machinery. What you see below is a mix of results from both training runs. Click on the name of any of these, and it’ll play the sound the neural net thought should go with it. (Click on the number to find out the original name of the sound)

NOTE: sound will play as soon as you click the link.

07037122.wav Blinks.

07060061.wav 22 o’clock

07022197.wav German household operating.

07072020.wav Small small children 

07005132.wav Piglet country.

07042121.wav Telephone, with chickens

07005121.wav Birds on mixer.

07042045.wav Hound up.

07026006.wav Interior, four o’clock.

07045119.wav Household 2 man barks.

07005205.wav Agitated door cat, interior, chickens – 1972

07037347.wav Cars: 1980.

07037445.wav 1 woman walking (reprocessed)

When the neural net’s sound effect is weirder, it’s harder to say whether it’s right or not. I’ve never heard any of these. So… maybe?

07065152.wav Birds of thunder.

07037379.wav Sheep operating.

07042196.wav Horse o’clock.

07050158.wav London down – traffic closed.

07037496.wav Small man continuous large poop.

07005206.wav Gravel bears – 1967

07064036.wav Flying people, 10 men, interior, applause – 1984

07039214.wav Telephone women, individual mews.

07066034.wav Electric water.

07038073.wav Many Men and some thrown.

07023107.wav Horses singing

07065046.wav Person stork.

07037366.wav National Parrot Road.

07071022.wav Death Interior, exterior, diosel notes (reprocessed)

And I guess we’ll just have to trust the neural net on these.

07005073.wav Firewomadoellic Bear departing.

07002266.wav Horse hopping on bonged screet.

07045229.wav Dinghy passes away.

07032091.wav Infant of ground.

07032270.wav Warble Yarring hour.

07010012.wav Scoop chimestimes bling. (Stolling Ghorters)

07058161.wav Electric School train seven o’clock (saying crush.)

07023217.wav Steel sparrows activity & two machine work, suburb passing over machinery.

07068025.wav Flying rubber sea.

07065052.wav Peacock butter, with background clock children. 

07070107.wav Sixty Bubble Machine, 1967 

07071043.wav Sawing brain dumping on bus, bombs women run. 

07064008.wav Tempressed bow, rush of cows from machine with continuous singing

07039077.wav Lose Timber Machine of Button Transpoop opened

I also trained the neural net with the sound files and the names reversed – thus, I can finally ask it to pick a sound file to go with anything I want. Behold, long-standing mysteries solved by advanced artificial intelligence!

07042194.wav One hand clapping
07018034.wav Silence
07042215.wav T. Rex
07018033.wav The Beginning of the Universe
07005137.wav The music of the spheres

For some more of these (including the more PG-13 examples), as well as bonus material every time I post, join my mailing list!