United States v. Five Hundred and Twelve Otters

The first group of defendants is arraigned (image: Brocken Inaglory via Wikipedia, CC 3.0)

In the past we’ve discussed the marvelous phenomenon of the in rem lawsuit, in which a government trying to obtain rights to something files a lawsuit “against” the property it’s trying to seize. Well, it’s not so much the lawsuits that are marvelous, but the case names that result from this practice. See, e.g., “United States v. 1855.6 Pounds of American Paddlefish Meat,” Lowering the Bar (Nov. 14, 2018) (discussing the case by that name, No. 4:18-cv-00207-SEB-DML (S.D. Ind. filed Nov. 13, 2018)). Other examples can be found, of course, on the Comical Case Names page.

The example in the headline, United States v. Five Hundred and Twelve Otters, isn’t a real case, but it easily could be. Several months ago, I think pre-pandemic but who remembers that far back, several people sent me a post by Janelle Shane, a computer scientist who writes about artificial intelligence “and the sometimes hilarious, sometimes unsettling ways that algorithms get things wrong.” One of her pastimes is training “neural networks” with text that human beings have written and then asking the AIs to generate new text of the same kind. The results often show that, like the Trump legal team, the networks are playing in the right general ballpark but clearly don’t understand what they’re doing. For example, feeding one AI the names of candle scents like “Mango Vanilla,” “Kiwi Bourbon,” and the truly horrifying “Christmasly Spiced Apple” (those were actually created by humans) caused the neural net to generate names like “Baggy Air,” “Friendly Wetsuit,” and “Freshly Scrutinized Orange.” I would absolutely buy a “Friendly Wetsuit” scented candle, though I would under no circumstances light it.
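(For the tinkerers: the basic recipe is simple enough to fake at home without any actual neural network. The sketch below is my own illustration, not Janelle’s code; a character-level Markov chain stands in for her neural nets, and the tiny “training set” is just two real in rem case names mentioned later in this post, not her dataset.)

```python
import random
from collections import defaultdict

# Hypothetical illustration only: a character-level Markov chain standing in
# for the neural networks described above. Same basic idea, though: learn
# from example text, then spit out new text one character at a time.

def train(names, order=3):
    """Map each `order`-character context to the characters seen after it."""
    table = defaultdict(list)
    for name in names:
        padded = "^" * order + name + "$"   # start padding plus end-of-name marker
        for i in range(len(padded) - order):
            table[padded[i:i + order]].append(padded[i + order])
    return table

def generate(table, order=3, max_len=80):
    """Sample one new name, character by character, from the trained table."""
    context, out = "^" * order, []
    while len(out) < max_len:
        nxt = random.choice(table[context])
        if nxt == "$":                      # hit the end-of-name marker
            break
        out.append(nxt)
        context = context[1:] + nxt
    return "".join(out)

# A toy training set: two case names mentioned in this post. A real model
# would want hundreds.
cases = [
    "United States v. Approximately 64,695 Pounds of Shark Fins",
    "South Dakota v. Fifteen Impounded Cats",
]

table = train(cases)
for _ in range(5):
    print(generate(table))
```

With only two examples it mostly parrots them back or splices them together where they overlap; feed it a few hundred real case names and the splices start to look like brand-new cases.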

This approach, of course, is perfect for generating new in rem case names, which is the subject of the post to which readers alerted me. Janelle gave the neural net a list of case names, including a couple that are also on my list (United States v. Approximately 64,695 Pounds of Shark Fins and South Dakota v. Fifteen Impounded Cats), and then asked it to generate its own. This yielded some superb names, including:

  • United States v. Five Hundred and Twelve Otters
  • South Dakota v. One Bobcat
  • France v. Three Odor-Producing Insects
  • Texas v. One Small Dog With a Napkin Near It
  • United States v. An Overcrowded Hole in the Ground
  • South Dakota v. Apparition at a Shoe Store

and

  • United States v. One Man With Numerous Footwide Gibberellic Organs

Again, these machines get the general idea, but then things get … a little weird. (I thought this one made up the word “gibberellic,” but it turns out that a “gibberellin” is a kind of plant hormone, and “gibberellic acid” is a food additive. See 21 C.F.R. § 172.725.) On the other hand, the ones that humans come up with can also be strange. Only in some cases could you distinguish between the human and machine creations, mainly because in rem litigation would have to concern something tangible. There would be no point in suing a hole in the ground, for example, no matter how crowded, or in suing an apparition at a shoe store.

Although I could definitely see the Trump legal team doing those things, if it hasn’t already.

Janelle has written a book on AI, called You Look Like a Thing and I Love You: How Artificial Intelligence Works and Why It’s Making the World a Weirder Place. If you buy it by clicking on that link, Janelle will make a couple of bucks, I will make a few cents, and Jeff Bezos will become richer by some infinitesimal percentage of his current wealth.

The otters will receive nothing.