A Glimpse into the Past: What a 1930s-Era Trained AI Thinks the World Will Be Like in 2026
Meet 'Talkie', a 13B-parameter language model that paints a nostalgic picture of the future based on texts from a bygone era.

Imagine a world where the horrors of a second global conflict are yet to be realized, and the future seems bright with steamships chugging along the oceans and railroads crisscrossing the continents. This is the vision presented by 'Talkie', a 13B-parameter language model with a unique twist: it was trained exclusively on texts written before 1931. Talkie's creators have essentially given it a 'historical blind spot', limiting its training data to a time when the world was still recovering from World War I and the Roaring Twenties were in full swing.
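The core idea of such a "historical blind spot" is simply a hard temporal cutoff on the training corpus. Talkie's actual data pipeline has not been published, so the sketch below is only an illustration of that idea, assuming each document carries a publication-year label; the record structure and function name are hypothetical.

```python
# Hypothetical corpus records of (text, publication_year). Talkie's real
# pipeline is not public; this only sketches a hard temporal cutoff.
corpus = [
    ("A serialized adventure story from a penny weekly ...", 1925),
    ("A postwar newspaper report ...", 1948),
    ("A Victorian-era detective story ...", 1892),
]

CUTOFF_YEAR = 1931  # keep only material written before 1931


def filter_pretraining_texts(records, cutoff=CUTOFF_YEAR):
    """Keep only documents published strictly before the cutoff year."""
    return [text for text, year in records if year < cutoff]


kept = filter_pretraining_texts(corpus)
print(len(kept))  # the 1948 report is excluded
```

In practice the hard part is not the filter itself but reliable publication-year metadata, since digitized historical collections often mix reprint dates with original publication dates.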
As a result, the model's predictions for the future, specifically for the year 2026, are infused with a nostalgic charm that seems quaint by today's standards. Its worldview is one where steam-powered vessels still dominate the seas and railroads continue to expand across the globe. Its vision of the future also includes a thriving literary scene, with penny novels and serialized stories captivating the imagination of the masses.
Perhaps most strikingly, Talkie expresses doubts about the likelihood of a second world war, the grim conflict that would go on to claim millions of lives and reshape the world order. The article, which first appeared on The Decoder, offers a fascinating glimpse into Talkie's anachronistic worldview and raises important questions about the role of historical context in shaping our understanding of the future. As AI models like Talkie continue to push the boundaries of what is possible with natural language processing, they also serve as a reminder of the limitations and biases inherent in training data.
In the end, Talkie's vision of 2026 serves as a provocative thought experiment, one that challenges us to consider how far we've come and how much we still have to learn from the past.
Source: The Decoder