@recentideas uses machine learning to learn how to tweet from recent tweets.

A second-order Markov chain predicts the next word from the previous two. The model is trained on tweets from Twitter using Hebbian learning: each time a transition is observed, the weight of its edge is strengthened.
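A minimal sketch of that training step, assuming the chain is stored as a dictionary mapping a two-word state to per-next-word weights (the sentinel tokens and function names are illustrative, not from the bot's actual code):

```python
# Hypothetical sketch: second-order Markov chain with Hebbian-style updates.
from collections import defaultdict

START = ("<s>", "<s>")  # assumed sentinel state marking a tweet's beginning
END = "</s>"            # assumed sentinel marking a tweet's end

def train(model, tweet):
    """Add one tweet's transitions to the model, +1 weight per observation."""
    state = START
    for word in tweet.split() + [END]:
        model[state][word] += 1            # Hebbian update: strengthen edge
        state = (state[1], word)           # slide the two-word window

model = defaultdict(lambda: defaultdict(int))
train(model, "recent ideas are recent tweets")
```

Each `(previous two words) -> next word` edge simply accumulates a count, so frequently seen transitions dominate later generation.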

To give more weight to recent tweets, all edge weights are periodically cut in half. When that happens, edges (transitions) whose weights fall below a threshold are removed, and then nodes (states) left without incoming transitions are removed. This decay also conserves memory.
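The decay-and-prune pass could look like the following sketch. The threshold value, the start-state sentinel, and the single (non-cascading) pruning pass are all assumptions:

```python
# Hypothetical sketch of the periodic decay-and-prune pass.
START = ("<s>", "<s>")  # assumed sentinel start state, never pruned

def decay_and_prune(model, threshold=1.0):
    """Halve all edge weights, then prune weak edges and orphaned states."""
    for state in list(model):
        edges = model[state]
        for word in list(edges):
            edges[word] /= 2
            if edges[word] < threshold:    # weak transition: remove edge
                del edges[word]
        if not edges:                      # no outgoing edges left
            del model[state]
    # A transition from (a, b) to word w leads into state (b, w); any
    # remaining state not reachable that way has no incoming transitions.
    reachable = {(s[1], w) for s, edges in model.items() for w in edges}
    for state in list(model):
        if state != START and state not in reachable:
            del model[state]

model = {START: {"a": 4.0}, ("<s>", "a"): {"b": 1.0}, ("a", "b"): {"c": 4.0}}
decay_and_prune(model)                     # leaves only START -> "a"
```

Halving plus a fixed threshold means an edge survives only while it keeps being reinforced, which is what biases the model toward recent tweets.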

Periodically, the bot generates and posts a tweet containing:

  • the total number of states and transitions (in base-36)
  • the sentence generated by greedily following the highest-weight transitions
  • sentences generated by random walks weighted by the model's edge weights, repeated until the 280-character limit is reached
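The three parts above can be sketched as follows. The sentinel tokens, function names, and the word cap guarding against cycles are assumptions, not the bot's actual implementation:

```python
# Hypothetical sketch of tweet generation: base-36 counts, one greedy
# sentence, then weighted-random sentences up to the 280-character limit.
import random

START, END = ("<s>", "<s>"), "</s>"        # assumed sentinel markers

def to_base36(n):
    """Render a non-negative integer in base-36."""
    digits = "0123456789abcdefghijklmnopqrstuvwxyz"
    out = ""
    while True:
        n, r = divmod(n, 36)
        out = digits[r] + out
        if n == 0:
            return out

def walk(model, choose, max_words=40):
    """Follow transitions from START, picking each next word with `choose`."""
    state, words = START, []
    while len(words) < max_words:          # guard against cyclic models
        edges = model.get(state)
        if not edges:
            break
        word = choose(edges)
        if word == END:
            break
        words.append(word)
        state = (state[1], word)
    return " ".join(words)

def greedy(edges):
    return max(edges, key=edges.get)       # highest-weight transition

def weighted(edges):
    return random.choices(list(edges), weights=list(edges.values()))[0]

def compose(model, limit=280):
    n_states = len(model)
    n_edges = sum(len(e) for e in model.values())
    tweet = " ".join([to_base36(n_states), to_base36(n_edges),
                      walk(model, greedy)])
    while True:                            # pad with weighted-random sentences
        sentence = walk(model, weighted)
        if len(tweet) + 1 + len(sentence) > limit:
            return tweet
        tweet += " " + sentence

model = {START: {"hi": 2, "yo": 1}, ("<s>", "hi"): {"there": 1},
         ("hi", "there"): {END: 1}, ("<s>", "yo"): {END: 1}}
tweet = compose(model)                     # starts with "4 5 hi there"
```

The same `walk` routine serves both bullet points: with `greedy` it deterministically follows the strongest edges, and with `weighted` it samples transitions in proportion to their weights.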