Predictions for 2050

Erik Hoel makes a list of predictions for 2050.

This may seem like the far-flung future, but as Hoel points out, it’s only 28 years away. Making predictions for 2050 based on what we see today is just like sitting in the early ‘90s and predicting what the world will look like in the 2020s.

Hoel makes his predictions based on a simple insight: change is incremental, and the minor trends of today are the institutional changes of tomorrow. If you want to know what 2050 will look like, think about the nascent trends of the early 2020s and project them into the future:

If you want to predict the future accurately, you should be an incrementalist and accept that human nature doesn’t change along most axes. … To see what I mean more specifically: 2050, that super futuristic year, is only 29 years out, so it is exactly the same as predicting what the world would look like today back in 1992. … what was most impactful from 1992 were technologies or trends already in their nascent phases, and it was simply a matter of choosing what to extrapolate. For instance, cellular phones, personal computers, and the internet all existed back in 1992, although in comparatively inchoate stages of development. … The central social and political ideas of our culture were established in the 1960s and 70s and took a slow half-century to climb from obscure academic monographs to Super Bowl ads. So here are my predictions for 2050. They are all based on current trends.

We think this approach is really smart. In fact, we like it so much that we wanted to take it for a test drive. In this post, we make our own set of predictions for 2050, using Hoel’s method of picking out trends that we suspect will go on to shape the 2020s, 2030s, and 2040s.

Projects are more fun when you do them with friends, so we invited a bunch of other bloggers to make their own predictions for 2050, using the same approach of extrapolating trends that they think are important today. So far we have predictions from:

Here at SMTM, we’re going to add something to Hoel’s original method of extrapolating “technologies or trends already in their nascent phases”: regression to the mean. What we mean by this is, well — the 20th century was very unusual in many different ways.

A lot of things that we take for granted are really, really new — like 401ks (invented in 1978), Traditional (1974) and Roth (1997) IRAs, and modern credit scores (1989). Indexes like the Dow Jones (1896) and the S&P 500 (1957) go back much further, but index funds that track them only appeared in 1972. In 1940, only 5% of US adults over 25 had a college degree and only 25% had a high school diploma. Even income tax wasn’t a permanent part of the US tax system until 1913 — we had to do a whole amendment to the Constitution to make it happen.

Some of these may be here to stay, but looking back from 2050, a lot of 20th century “institutions” will look like a flash in the pan. The trends that are holding will probably hold, but any 20th century abnormalities that seem to be reversing are likely to go back to the way they were for most of human history. A nascent trend that looks like regression to the historical mean is much more likely to be a trend that will continue on to 2050.

Hoel’s Predictions

We agree with a lot of Hoel’s predictions. A Martian colony, or crewed missions to Mars at least, are looking pretty likely as the price of space travel drops (and he’s not the only one predicting this). We’re also reminded of the recent increasing interest in charter cities and Georgism — Mars would be a great location for your wacky new city and it’s the closest we’re going to get to making all-new land, at least any time soon. 

Hoel is clearly right that we will move away from stores, but this might also look like more business done out of people’s homes (as was done historically) or like more business done in something like a marketplace with semi-permanent stalls (as was done historically).

Genetic engineering of embryos to avoid disease is already being done and it does seem like this will happen more and more. Similarly, anti-aging technology is already here and will just keep getting better and cheaper, especially given that Peter Thiel is involved. But this is sort of hard to square with Hoel’s final prediction, that 2050 will be “the winter of my life”, at the age of only 62. It seems a little pessimistic on Hoel’s part. Didn’t you hear that in 2050, 62 will be the new 25?  

Sometimes we agree with the general picture, but not with the details. Education will indeed be mostly online (again, it already is), but it will look more radical than what Hoel imagines. The real education giant today is not Harvard, or even MITx, but YouTube, and we will see more of THAT in the future. 

Hoel is right that AI will be impactful in day-to-day life, but we think this is true only in the obvious ways. You will still have Siri and Alexa, but you won’t have Data, or even Bender. We might have better image classifiers and even decent chatbots. Strong AI may be a possibility by 2050 (a topic for another time), but by the “extrapolate the future from current trends” technique, in 2050 many classifiers will still have a hard time telling the difference between a dragonfly and a bus. GPT-29 will be able to churn out a movie script as formulaic as that of the average Hollywood scriptwriter (and may well replace them) but it won’t be replacing writing that requires anything as complicated as “themes” or “meaning”. 

Hoel predicts wild changes in family structure — specifically, the decline of the family and the rise of the throuple. We agree family dynamics will change, but again we disagree on the specifics. More on this in a minute.

There are a few predictions we disagree with outright. Hoel predicts that the online mob will create an endless culture war, and that “the future really is female”. But the current culture war is amusingly soft compared to many cultural conflicts in living memory, and the fact that women get a majority of all degrees means very little if you believe that university degrees are worth less and less all the time!

Finally, we disagree that people and culture will become boring. Thanks for reading a pseudonymous mad science blog called SLIME MOLD TIME MOLD.

SMTM Predictions

Robotics

This first one is less an original prediction than an elaboration on Hoel, who says: 

Buzzing drones of all shapes and sizes will be common in the sky (last year Amazon won FAA approval for its delivery drone service, opening the door for this). Small robots will be everywhere, roving the streets in urban areas, mostly doing deliveries.

We agree. Robots will stay dumb but you will see a lot of them, possibly in delivery. It’s hard to look at work from Boston Dynamics and not expect that in 30 years we’ll have lots of quadrupedal robots trotting around our streets, carrying goods and generally acting as porters, footmen, and stevedores. 

If you get the price point low enough, small robots might even replace backpacks and suitcases. Boston Dynamics’ robot Spot currently costs $74,500, but 30 years of R&D can do a lot. Let’s take computers — in the early 90s, 1 GB of storage cost about $10,000. But these days you can get a 2 TB drive for about $50, which puts 1 GB at only a few cents. If the same thing happened to Spot, it would cost less than a dollar. We don’t expect anything this drastic, but similar forces could turn quadrupedal robots into household goods. Our bet is on robotic palanquins.

The Witch of the Waste: robotics thought leader
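For the curious, here’s the back-of-the-envelope math as a quick Python sketch. The storage prices and Spot’s list price are just the rough figures from the paragraph above, and the “same rate of decline” assumption is purely illustrative — as we said, we don’t actually expect robot prices to track hard-drive prices.

```python
# Back-of-the-envelope: apply the hard-drive price decline to Spot.
# All figures are the rough numbers quoted above; treat them as illustrative.

storage_1992_per_gb = 10_000          # ~$10,000 per GB in the early 90s
storage_today_per_gb = 50 / 2_000     # a $50 2 TB drive -> about $0.025 per GB

decline_factor = storage_1992_per_gb / storage_today_per_gb
print(f"Storage got ~{decline_factor:,.0f}x cheaper in ~30 years")
# -> Storage got ~400,000x cheaper in ~30 years

spot_today = 74_500                   # Boston Dynamics' Spot, USD
spot_if_same_decline = spot_today / decline_factor
print(f"Spot at the same rate of decline: ${spot_if_same_decline:.2f}")
# -> about $0.19, hence "less than a dollar"
```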

It also seems very likely that with 30 more years of R&D, we’ll have ironed out all the last problems with self-driving cars, so expect that kind of robot as well.

More Infectious Disease

For most of human history, infectious disease was a fact of life. As in so many things, the 20th century was an aberration. We developed antibiotics, improved hygiene, even eliminated some diseases altogether. But this pleasant moment in the sun is over. Someone writing in December 2019 might be forgiven for thinking that with our medical knowledge and scientific might, we could defeat any disease that might rise up. But evidently not.

This XKCD from 2015 aged kind of poorly. explainxkcd.com even says, “at the time of writing it was not readily apparent that the old dog still has some teeth”

This means more pandemics. Many will become endemic, as will probably happen with COVID. Some existing diseases will become resistant to our best antibiotics. If we’re really unlucky, we will see the return of smallpox or some horrible mystery plague released by the thawing permafrost. (Hoel is also concerned about this.) 

We still have germ theory, so we won’t be sent back to the state of things in 1854. But the future will look more like the past, and we’ll have to start paying attention to disease in the way our ancestors did. As historian Ada Palmer describes, “I have never read a full set of Renaissance letters which didn’t mention plague outbreaks and plague deaths, and Renaissance letters from mothers to their traveling sons regularly include, along with advice on etiquette and eating enough fennel, a list of which towns to avoid this season because there’s plague there.” Embrace tradition with this delicious recipe for Fenkel in soppes.

Citizen Research

These days, big universities and medical centers and stuff are responsible for most research. But this is a big deviation from the historical norm. In the past, random haberdashers and architects and patent clerks and high school teachers, or just rich people with too much time on their hands, were the ones doing most of the cutting-edge research. 

There are already many signs of regression to the mean on this. Anonymous 4channers are publishing proofs of longstanding superpermutation problems on anime boards. The blog Astral Codex Ten (and its predecessor Slate Star Codex, by the same author) publishes major reviews (“much more than you wanted to know”) on a wide variety of topics — disease seasonality, links between autism and intelligence, melatonin, you name it. Sometimes the author even does empirical work — case studies on the effect of CO2 on cognition, large nootropics usage surveys, studies of SSRI usage, etc.

Pseudonymous internet besserwisser Gwern writes long articles on everything from Gaussian expected maximums to generating anime faces with neural networks. Wikipedia, the largest and most-read reference work in history, is written entirely by volunteers. And of course there’s us, Slime Mold Time Mold, creating a book-length original work where we argue for a new theory of the obesity epidemic. 

This is only going to speed up. The 2020s will see a lot more research from people who aren’t in the academy, and by 2050, most of the best scholarship will be done by laypeople.

Elective Chemistry

At some point in the near future, the trends of plastic surgery, nootropics, psychedelic legalization, trans hormone therapy, and bodybuilding will collide, with spectacular results. 

Doing things to reshape your body and mind is an idea as old as dirt, but with recent advances in technology, and breakdowns in cultural taboos, the practice of what could be called “elective chemistry” is going to take off, probably in the next 10 or 20 years. 

Why let nature be the only one who has any say over the chemicals affecting your mind and body? It’s already common to use caffeine, alcohol, and tobacco to reshape your mind. If you’re willing to go out of your way, you can get a psychiatrist to prescribe any number of mind-altering chemicals, and many people today are on lexapro or modafinil or adderall or wellbutrin full-time. And while this is easy enough to do legally, it’s even easier outside the law — many people use psychedelics, steroids, or hormone therapies illegally, to change their minds or bodies as they see fit.

This won’t just become more acceptable for people on the margins of society — it will become mainstream. Cis people are already the largest consumers of hormone therapy and other medical procedures normally associated with trans healthcare (largely because of base rates, but even so). Cis men sometimes go on androgen replacement therapy as they age, and cis women often go on hormone replacement therapy after menopause, which sometimes includes testosterone. And it’s equally easy to use them as mind-altering substances, since they have psychological effects as well as physical ones.

Working out, getting plastic surgery, and taking steroids or hormones are all just forms of body modification. We’ve already come to accept piercings and tattoos, to the point where they’re practically boring. In the near future, most forms of body modification will be unremarkable, in the literal sense that you cannot be bothered to remark on them.  

(This may be extended even further by the development of better prosthetics, like the extra thumb, or by connecting your brain directly to social media — wait, that last one seems like a bad idea.)

Europe will become less important, regional politics more important, and the world less globalized

Europe was a technological and cultural backwater for most of history. Then, in the 16th century, Europe began a period of explosive growth and development, sometimes called the Great Divergence. There’s a lot of interesting debate as to why this happened, but it definitely did happen. 

It was also definitely a historical anomaly, and there are already signs of things going back to the way they were. The tilt in favor of Europa and her direct offshoots lasted up to the middle of the 20th century, but since 1950 things have been turning around:

The fastest-growing economies in the world are all countries like Bangladesh, Ethiopia, Vietnam, Turkey, and Iran. Brazil is already the 13th-largest economy in the world, Indonesia the 16th, and Nigeria the 27th — all ahead of countries like Ireland (29th), Norway (31st), Denmark (37th), and Portugal (49th). It’s hard to predict who the big winners will be, but it’s clear that Europe will become less and less important, as countries in the rest of the world become major powers.

As wealth and power get more distributed, supply chains will get shorter and less global. Measures of globalization used to increase year after year, but they sputtered in the financial crash of 2008 and never really recovered. COVID has provided another shock, a disruption that is far from over. There isn’t really a trend away from globalization yet, but the trend in favor of globalization has definitely stalled.

There may also be regression to the mean in protectionism. Historically, many states have supported themselves largely through tariffs (the US included), and protectionism may be good for growing economies. If globalization really has stalled for the long term, and certainly if it starts to reverse, we may see more and more tariffs, even a shift in how governments fund themselves. Russia and India have already begun taking steps in that direction, and other countries may follow.

Non-nuclear families

Historically, most people lived in large extended families. The nuclear family, at least as we know it today, is largely an artifact of the unusual circumstances of the 20th century. As income inequality and the cost of buying a home increase, more people will live in large groups — be that group houses, “adult dorms”, or multigenerational homes. COVID has accelerated this trend. More young adults (18 to 29) are living at home now than they were at any point since 1900. The future doesn’t look like Leave It to Beaver, or even The Simpsons.

Part of this will be transitioning back to a system where familial wealth is more important than personal wealth. Historically if your family disowned you, you were screwed. This is why a mainstay of 19th century literature is killing your brother for an inheritance.

And as much as the “kill your brother for the inheritance” thing was a pattern of the upper classes, familial wealth was more important than personal wealth even for peasants (though for peasants, it was sort of more communal wealth than familial).

This is why we agree with Hoel’s prediction of major changes in family structure. We agree that “normal families” are on their way out. But we disagree on nearly all the specifics. We don’t expect to see lots of single-parent homes — we expect more multi-generational homes, group homes, or other arrangements, with lots of adults co-raising children. See e.g. Kelsey Piper’s experience, her main conclusions being “I have no idea how people with two parent households manage” and “I wish we had even more breastfeeding parents”. Put that on a t-shirt: Even More Breastfeeding Parents by 2050.

And instead of seeing a rise in throuples, we expect to see a return of that very old-fashioned arrangement, the Harem — where a person of means has multiple wives, one wife and multiple concubines, etc. etc. 

Wage labor becomes less common 

Tying yourself to a major employer is still the norm today, but this is changing. Some people will be paid on retainer (i.e. a salary), and some jobs where you really are being paid for your time (e.g. security guard) will still be hourly, but more and more people will be paid to complete specific tasks or deliver a particular result, with no questions asked as to how fast they did it.  

We think the gig economy is coming for the rest of the marketplace, but instead of everything being chopped up into little tasks and ruled by corporations (à la Uber), we expect it to look like more contract workers and fewer full-time employees. More people will be self-employed, or will form small companies to deliver goods or services on demand. 

We expect this is (mostly) a good thing. People benefit from being their own boss and being able to do the work however they want, as long as they get it done. Being paid to stand around and look busy isn’t good for anyone. 

To a historical person, wage labor would be one of the strangest things about the modern world, and the idea of a steady job with benefits would be even stranger. Most people were farmers and almost never had any reason to handle money. Even if you were a potter or a blacksmith, you were paid for your product, the actual bowl or knife you were selling, not for your labor or the hours you worked.

Antibodies to the Outrage Economy

Once upon a time, clickbait was a major annoyance, but it was mostly a problem because people fell for it. The term was invented in 2006, and clickbait was the scourge of the internet for a few years, but by 2014 the cultural immune response was in full swing. The Onion launched ClickHole that year, Facebook started taking steps to squash clickbait, blah blah blah.

Now, no one reads clickbait because we’ve learned better. People are learning again. Hoel is worried that “Social media will ensure an endless culture war and internal social upheaval.” But we’re not worried, because soon we will develop cultural antibodies to the outrage economy, just like we developed cultural antibodies to clickbait, or to evolution vs. creationism debates, or to whatever was blowing up the internet in the 1990s (arguments about Microsoft?).

In fact, we’re already getting there. There was a time when we used to click on outrageous political stories. Now I think, “They’re rifting me”, and move on without clicking. No one has written the definitive piece on it yet, but “don’t read the news” is a meme that’s gaining steam. We hear it from our friends all the time. People are waking up to the fact that the news will do almost anything to raise your blood pressure, and that freaking out about “the issues of the day” does no one any good.

There will always be some new brainworm that we have to develop cultural antibodies to. And it might be fun to speculate which stupid argument will threaten to tear us apart next. But the outrage economy is on its way out, and the divisions of 2050 will look very, very different from the divisions we see today.

Identity and Anonymity Online

In the early days of the internet, everyone was anonymous — as the old saying goes, “on the Internet, nobody knows you’re a dog.”

But today the assumption is that everyone uses their real name. This is Mark Zuckerberg’s fault, for pushing real names on Facebook. “You have one identity,” he says in David Kirkpatrick’s 2010 book, The Facebook Effect. “The days of you having a different image for your work friends or co-workers and for the other people you know are probably coming to an end pretty quickly. Having two identities for yourself is an example of a lack of integrity.”

But this will be even more of a flash in the pan than fighting about politics online. Internet anonymity is already coming back into style (hello) and this trend will continue into the future. Most people will have a mix of public and private accounts, pseudonyms, alts, and pen names. As with many of our other “predictions”, this is pretty much true already — what will change is that there will eventually be widespread acknowledgement and acceptance.

This is also a regression to the historical mean. Public anonymity and pseudonymity are a long and esteemed tradition — just ask Voltaire, George Sand, Mark Twain, Lewis Carroll, George Orwell, or Dr. Seuss.

During the American revolution, practically everyone was using a pseudonym. Many of these guys were already famous public figures, and ALSO writing pseudonymous letters. They were having it both ways — they had alts! 

Alexander Hamilton, James Madison, and John Jay wrote The Federalist Papers under the name “Publius”. But Ben Franklin was the real master of this — his pseudonyms included not only Richard Saunders of Poor Richard’s Almanack, but also “Silence Dogood”, “Caelia Shortface”, “Martha Careful”, “Anthony Afterwit”, “Miss Alice Addertongue”, “Polly Baker”, and “Benevolus”. We shudder to think what he would have done with even a dial-up connection.

Advances in crypto, VR, AR, and social networking will splinter the web, not unite it. More virtual locations means more places for different identities to thrive — just like how your family group text is different from the Discord channel you have with your friends, is different from your reddit comments, is different from your LinkedIn profile, is different from the messages you send on tinder. 

Loss of the distinction between “Lowbrow” and “Highbrow”

In 2018, Kendrick Lamar’s DAMN. became the first rap album to win a Pulitzer. Before this, the prize had only ever gone to classical or jazz. At the Tokyo 2020 Olympics, skateboarding made its debut, and most of the medals went to teenagers. The game Hades from Supergiant Games recently won a Hugo Award. It’s the first video game to do so, but it’s unlikely to be the last.

In fact, the Hugo Awards themselves may be a good example of this trend. Back in 1953 when the Hugos started, fantasy and science fiction (and everything nerdy) was fringe stuff, totally marginal. Today, comic book superheroes dominate the box office and Targaryens are household names. 

This trend shows every sign of continuing. Things that are fringe, lowbrow, and popular will continue getting more and more official recognition, to the point that we will eventually lose the distinction between lowbrow and highbrow art altogether. Olympic fencing is already on the same plane as Olympic surfing, and soon there will be no social difference between comic games like Untitled Goose Game by House House, and comic operas like Le nozze di Figaro by Wolfgang Amadeus Mozart. If that seems impossibly flippant, remember that Mozart once composed a canon in B-flat major called “Lick me in the ass”.

High culture

This is part of why we’re not concerned that people and culture will become boring — cultural forces are constantly driving bizarre, fringe works towards the mainstream, and this trend shows no signs of stopping. Among other things, this will be really good for social mobility.

Minorities as minorities

Saying that America will be a majority-minority country by 2050 is the wrong way of thinking about it. By 2050, we won’t think about minorities in the same way at all — we’ll give up the minority-nonminority framing in favor of something more specific.

The categories that are important now won’t be important in 30 years. Concepts that we take for granted — the idea of being Italian, or German, or even just European — didn’t exist until pretty recently. We expect a return to a sort of negative multiculturalism — everyone is sort of fighting with everyone else, like how all the cities on the Italian peninsula used to go at it without much of a sense of shared Italian identity.

Legacy media struggles to keep up, but race and gender already compete with minority identities like subculture and political leanings. Your identity comes from being a goth, a furry, by wearing hiking clothes to the office, by wearing a $1,200 Canada Goose jacket on the New York subway in October, by your favorite sports team, by the websites you frequent, by which author or podcaster you won’t shut up about, by which YouTubers you reference, by being a progressive or a libertarian or an ACAB commie. In many contexts, your status as one of these minorities already matters more than your race, gender, or even sexuality — and online, your meatspace traits barely matter at all. 

The True Uses of the Internet are not yet Known

Johannes Gutenberg invented the printing press in 1440, but Martin Luther didn’t publish his 95 theses until 1517. If it takes a new technology 77 years to come into its own, we shouldn’t be surprised.

There are a number of dates we could choose for the invention of the internet — the first ARPANET connections in 1969, the TCP/IP standard in 1982, or the first web pages in 1991. Maybe 1993 is the right choice, being the year of Mosaic (the first popular web browser), the first public HTML drafts, and Eternal September, though basic technologies like URLs and HTTP didn’t come until a few years later!

If we do go with 1993, then 77 years later would be the nice round 2070. Maybe the modern world moves a little bit faster than the protestant reformation, but anyone who thinks the internet hasn’t lived up to expectations in terms of changing the world should wait a minute. The cutting-edge developments of the early 2020s will come to seem like Jacobus de Varagine’s Legenda aurea — which you’ve probably never heard of, that’s the point. We haven’t seen the internet’s real face yet.

9 thoughts on “Predictions for 2050”

  1. Great writeup! 🙂

    I’ve been thinking for a while how nice it might be to go back to multigenerational families. It sounds like it’d be so much easier to raise kids when you can ask your parents to take care of your kids for a bit while you take a break and watch Netflix, go on a true date with your wife, go to the gym, etc.

    Of course, that assumes that you have nice parents and that the parents are nice to their stepdaughter/stepson, which has historically been a huge problem.

    Family wealth makes sense, too. My Dad is always saying, “What good will getting an inheritance do you when you’re 50? Surely you need that money now, when you’re 30?” Makes sense to me!


  2. YD says:

    I don’t see why the distinction between lowbrow and highbrow will disappear. The current lowbrow things will become highbrow, but I expect there to always be new lowbrow things to replace it.

    In 2050 we (as in old people who were alive and commenting on blogs all the way back in 2022) will probably be complaining about how kids these days enjoy their newfangled _____, instead of the classics like Superman and Game of Thrones 😀


  3. osgat says:

    Some wild speculations here – harems?? – but permanent human presence in space looks reasonable. Not Mars however – it looks like many people don’t realize how far it is and how harsh the conditions there are. Space habitats are the most likely initial settlements, getting resources from Earth, the Moon, or near-Earth asteroids.


  4. SemiProPainter says:

    This was really interesting!

    However, I have one question. Why should the regression to the mean work like you think it should? I agree with you that the twentieth century was very unusual in some respects, but why should we necessarily revert to the historical mean? Likely the 21st century will also be very unusual, so why should we expect that a lot of the trends of the 20th century revert to the 19th century and before rather than towards a different mean?

    For instance, the “historical mean” of humanity is living in small groups with no writing and limited contact with anyone else. States are deeply unnatural in the context of that backdrop, yet no one seems to assert that states will naturally fade away. Why should we be able to use “regression to the mean” to assert other more recent trends?

    Anyways, thank you for posting.


    1. Some trends (like literacy rates) show no signs of stopping. High literacy rates are historically weird but don’t show any sign of going away, so they probably won’t. Other trends (like deaths from infectious diseases) did very unusual things in the 20th century and show signs of reversing. We think these trends are likely to continue toward historical baselines.


  5. Douglas Muth says:

    > Your identity comes from being a goth, a furry, by wearing hiking clothes to the office

    I’m a furry and I choked on my water when reading this line!

    Nicely done. 🙂


  6. Taj says:

    A nerdy note: “basic technologies like URLs and HTTP didn’t come until a few years later” is way off. They weren’t formally specified until a year or two after the first browsers, but the specifications were playing catch-up. Both technologies are fundamental to any functional browser.


  7. Colin McGlynn says:

    Very interesting! All of these predictions seem plausible to me except for AI. The level of improvement you predict feels like it is smaller than the difference between GPT-2 and GPT-3. Why would improvement slow down so much?


    1. Good question. Without getting too into it, a common pattern with AI research going back to the very beginning is finding it relatively easy to get to 90% performance but much harder to get to 95% and sometimes impossible to get to 100%. Depends on the domain, of course — they did end up blowing chess out of the water.

